# LlamaIndex Core

The core Python package of the LlamaIndex library. Core classes and abstractions
represent the foundational building blocks for LLM applications, most notably
retrieval-augmented generation (RAG). These building blocks include abstractions for
LLMs, Vector Stores, Embeddings, Storage, Callables, and several others.
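
As a quick illustration of how these building blocks compose, here is a minimal sketch using the high-level API. It assumes an LLM and an embedding model are available (LlamaIndex resolves them to OpenAI by default, which requires the corresponding integration packages and an `OPENAI_API_KEY`) and that a local `data/` directory of documents exists.

```python
# Minimal RAG sketch: a reader, embeddings, a vector store index, and an
# LLM-backed query engine, wired together by llama-index-core's defaults.
# Assumes an LLM/embedding integration is configured and "data/" exists.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

documents = SimpleDirectoryReader("data").load_data()  # load files into Document objects
index = VectorStoreIndex.from_documents(documents)     # embed documents and build the index
query_engine = index.as_query_engine()                 # retrieval + response synthesis
print(query_engine.query("What does this package do?"))
```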

We've designed the core library so that it can be easily extended through subclasses.
Building an LLM application with LlamaIndex therefore means combining LlamaIndex core
with the LlamaIndex [integrations](https://github.com/run-llama/llama_index/tree/main/llama-index-integrations) your application needs; a sketch of extending a core abstraction follows below.
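
To make the subclassing point concrete, here is a sketch of a custom retriever built on the `BaseRetriever` abstraction; the `StaticRetriever` class and the node it returns are purely illustrative.

```python
# Sketch of extending a core abstraction by subclassing: a toy retriever
# that always returns one fixed node. Names and contents are illustrative.
from typing import List

from llama_index.core.retrievers import BaseRetriever
from llama_index.core.schema import NodeWithScore, QueryBundle, TextNode


class StaticRetriever(BaseRetriever):
    """Hypothetical retriever that returns the same node for every query."""

    def _retrieve(self, query_bundle: QueryBundle) -> List[NodeWithScore]:
        node = TextNode(text="llama-index-core provides the base abstractions.")
        return [NodeWithScore(node=node, score=1.0)]


retriever = StaticRetriever()
for result in retriever.retrieve("what is llama-index-core?"):
    print(result.score, result.node.text)
```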

## Raw data

```json
{
  "_id": null,
  "home_page": "https://llamaindex.ai",
  "name": "llama-index-core",
  "maintainer": "Andrei Fajardo",
  "docs_url": null,
  "requires_python": "<4.0,>=3.9",
  "maintainer_email": "andrei@runllama.ai",
  "keywords": "LLM, NLP, RAG, data, devtools, index, retrieval",
  "author": "Jerry Liu",
  "author_email": "jerry@llamaindex.ai",
  "download_url": "https://files.pythonhosted.org/packages/b6/24/a86dfa9445acea8ff4b0d5afcdec745deb33a7658bb433d09e12f451a01d/llama_index_core-0.12.8.tar.gz",
  "platform": null,
  "description": "# LlamaIndex Core\n\nThe core python package to the LlamaIndex library. Core classes and abstractions\nrepresent the foundational building blocks for LLM applications, most notably,\nRAG. Such building blocks include abstractions for LLMs, Vector Stores, Embeddings,\nStorage, Callables and several others.\n\nWe've designed the core library so that it can be easily extended through subclasses.\nBuilding LLM applications with LlamaIndex thus involves building with LlamaIndex\ncore as well as with the LlamaIndex [integrations](https://github.com/run-llama/llama_index/tree/main/llama-index-integrations) needed for your application.\n",
  "bugtrack_url": null,
  "license": "MIT",
  "summary": "Interface between LLMs and your data",
  "version": "0.12.8",
  "project_urls": {
    "Documentation": "https://docs.llamaindex.ai/en/stable/",
    "Homepage": "https://llamaindex.ai",
    "Repository": "https://github.com/run-llama/llama_index"
  },
  "split_keywords": [
    "llm",
    " nlp",
    " rag",
    " data",
    " devtools",
    " index",
    " retrieval"
  ],
  "urls": [
    {
      "comment_text": "",
      "digests": {
        "blake2b_256": "83ae6218c7ecf6bd671d5e1412bea672557d416f2f8edd6db76e08dac9cb53d3",
        "md5": "2be37ec899d9311a1364c4d0614ba976",
        "sha256": "7ebecbdaa1d5b6a320c050bf90525605ac03b242d26ad55f0e00a0e1df69e070"
      },
      "downloads": -1,
      "filename": "llama_index_core-0.12.8-py3-none-any.whl",
      "has_sig": false,
      "md5_digest": "2be37ec899d9311a1364c4d0614ba976",
      "packagetype": "bdist_wheel",
      "python_version": "py3",
      "requires_python": "<4.0,>=3.9",
      "size": 1583226,
      "upload_time": "2024-12-21T01:19:27",
      "upload_time_iso_8601": "2024-12-21T01:19:27.835327Z",
      "url": "https://files.pythonhosted.org/packages/83/ae/6218c7ecf6bd671d5e1412bea672557d416f2f8edd6db76e08dac9cb53d3/llama_index_core-0.12.8-py3-none-any.whl",
      "yanked": false,
      "yanked_reason": null
    },
    {
      "comment_text": "",
      "digests": {
        "blake2b_256": "b624a86dfa9445acea8ff4b0d5afcdec745deb33a7658bb433d09e12f451a01d",
        "md5": "99859a23579ff0bf8b6300f815e50da0",
        "sha256": "3b360437b4ae47b7bd1733f6492a95126e6739c7a2fd2b649ebe8bb3afea7143"
      },
      "downloads": -1,
      "filename": "llama_index_core-0.12.8.tar.gz",
      "has_sig": false,
      "md5_digest": "99859a23579ff0bf8b6300f815e50da0",
      "packagetype": "sdist",
      "python_version": "source",
      "requires_python": "<4.0,>=3.9",
      "size": 1332649,
      "upload_time": "2024-12-21T01:19:32",
      "upload_time_iso_8601": "2024-12-21T01:19:32.513942Z",
      "url": "https://files.pythonhosted.org/packages/b6/24/a86dfa9445acea8ff4b0d5afcdec745deb33a7658bb433d09e12f451a01d/llama_index_core-0.12.8.tar.gz",
      "yanked": false,
      "yanked_reason": null
    }
  ],
  "upload_time": "2024-12-21 01:19:32",
  "github": true,
  "gitlab": false,
  "bitbucket": false,
  "codeberg": false,
  "github_user": "run-llama",
  "github_project": "llama_index",
  "travis_ci": false,
  "coveralls": false,
  "github_actions": true,
  "lcname": "llama-index-core"
}
```
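
The block above mirrors what PyPI's JSON API returns for a single release. As a sketch, the same metadata (and the published wheel digest) can be fetched and checked via the standard `/pypi/<project>/<version>/json` endpoint; the verification step is optional and downloads the wheel.

```python
# Sketch: fetch the release metadata shown above from PyPI's JSON API and
# verify the wheel's published sha256 digest.
import hashlib
import json
from urllib.request import urlopen

with urlopen("https://pypi.org/pypi/llama-index-core/0.12.8/json") as resp:
    meta = json.load(resp)

for artifact in meta["urls"]:
    print(artifact["filename"], artifact["digests"]["sha256"])

# Download the wheel and compare against the published digest (network + disk I/O).
wheel = next(a for a in meta["urls"] if a["packagetype"] == "bdist_wheel")
with urlopen(wheel["url"]) as resp:
    data = resp.read()
assert hashlib.sha256(data).hexdigest() == wheel["digests"]["sha256"]
```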