# LlamaIndex Core
The core Python package of the LlamaIndex library. Core classes and abstractions
represent the foundational building blocks for LLM applications, most notably
RAG (Retrieval-Augmented Generation). These building blocks include abstractions for
LLMs, Vector Stores, Embeddings, Storage, Callables, and several others.
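
For example, these core abstractions compose into a minimal RAG pipeline. The sketch below is illustrative rather than canonical: it assumes a `./data` directory of documents and uses the default LLM and embedding model (which require an `OPENAI_API_KEY`) unless `Settings` is configured otherwise.

```python
# A minimal RAG sketch built only from llama-index-core abstractions.
# Assumes a ./data directory with documents and an OPENAI_API_KEY in the
# environment for the default LLM and embedding model.
from llama_index.core import (
    SimpleDirectoryReader,
    StorageContext,
    VectorStoreIndex,
    load_index_from_storage,
)

# Load raw files into Document objects.
documents = SimpleDirectoryReader("./data").load_data()

# Embed the documents and build an in-memory vector store index.
index = VectorStoreIndex.from_documents(documents)

# Persist the index (docstore, vector store, index store) to disk ...
index.storage_context.persist(persist_dir="./storage")

# ... and reload it later without re-embedding.
storage_context = StorageContext.from_defaults(persist_dir="./storage")
index = load_index_from_storage(storage_context)

# Query: retrieve relevant nodes and synthesize an answer with the LLM.
query_engine = index.as_query_engine()
print(query_engine.query("What does this repository do?"))
```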
We've designed the core library so that it can be easily extended through subclassing.
Building LLM applications with LlamaIndex thus involves building with LlamaIndex core
as well as with the LlamaIndex [integrations](https://github.com/run-llama/llama_index/tree/main/llama-index-integrations) your application needs.
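
As a sketch of both points, the example below swaps in an LLM from the separately installed `llama-index-llms-openai` integration via `Settings` and subclasses the core `BaseRetriever` abstraction. `StaticRetriever` and its node contents are hypothetical examples, not part of the library.

```python
# Sketch: extending a core abstraction and wiring in an integration package.
# Assumes `llama-index-llms-openai` is installed and OPENAI_API_KEY is set;
# StaticRetriever is a made-up example class.
from typing import List

from llama_index.core import Settings
from llama_index.core.retrievers import BaseRetriever
from llama_index.core.schema import NodeWithScore, QueryBundle, TextNode
from llama_index.llms.openai import OpenAI  # integration package

# Use an integration-provided LLM for all components that need one.
Settings.llm = OpenAI(model="gpt-4o-mini")


class StaticRetriever(BaseRetriever):
    """Toy retriever that always returns the same nodes."""

    def __init__(self, nodes: List[TextNode]) -> None:
        super().__init__()
        self._nodes = nodes

    def _retrieve(self, query_bundle: QueryBundle) -> List[NodeWithScore]:
        # A real retriever would rank nodes against query_bundle.query_str.
        return [NodeWithScore(node=n, score=1.0) for n in self._nodes]


retriever = StaticRetriever([TextNode(text="llama-index-core is the core package.")])
nodes = retriever.retrieve("What is llama-index-core?")
print(nodes[0].node.get_content())
```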
## Raw data

```json
{
    "_id": null,
    "home_page": "https://llamaindex.ai",
    "name": "llama-index-core",
    "maintainer": "Andrei Fajardo",
    "docs_url": null,
    "requires_python": "<4.0,>=3.9",
    "maintainer_email": "andrei@runllama.ai",
    "keywords": "LLM, NLP, RAG, data, devtools, index, retrieval",
    "author": "Jerry Liu",
    "author_email": "jerry@llamaindex.ai",
    "download_url": "https://files.pythonhosted.org/packages/b5/5d/653ff8a89136b99c707e19040cf6ab8131f78fee54f43886e497b675a017/llama_index_core-0.12.1.tar.gz",
    "platform": null,
    "description": "# LlamaIndex Core\n\nThe core python package to the LlamaIndex library. Core classes and abstractions\nrepresent the foundational building blocks for LLM applications, most notably,\nRAG. Such building blocks include abstractions for LLMs, Vector Stores, Embeddings,\nStorage, Callables and several others.\n\nWe've designed the core library so that it can be easily extended through subclasses.\nBuilding LLM applications with LlamaIndex thus involves building with LlamaIndex\ncore as well as with the LlamaIndex [integrations](https://github.com/run-llama/llama_index/tree/main/llama-index-integrations) needed for your application.\n",
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "Interface between LLMs and your data",
    "version": "0.12.1",
    "project_urls": {
        "Documentation": "https://docs.llamaindex.ai/en/stable/",
        "Homepage": "https://llamaindex.ai",
        "Repository": "https://github.com/run-llama/llama_index"
    },
    "split_keywords": [
        "llm",
        " nlp",
        " rag",
        " data",
        " devtools",
        " index",
        " retrieval"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "5424cad9153b5aa715ac5b9d079e2474ec846f85dc90136e86e1ea0aed800967",
                "md5": "6e1477000a7588c4805a08bcc55673b7",
                "sha256": "6de135c890f711a0cf1f80d98d6981015fd14a3864c48e087d91f7e845d8a0a3"
            },
            "downloads": -1,
            "filename": "llama_index_core-0.12.1-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "6e1477000a7588c4805a08bcc55673b7",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": "<4.0,>=3.9",
            "size": 1578184,
            "upload_time": "2024-11-20T22:39:27",
            "upload_time_iso_8601": "2024-11-20T22:39:27.741229Z",
            "url": "https://files.pythonhosted.org/packages/54/24/cad9153b5aa715ac5b9d079e2474ec846f85dc90136e86e1ea0aed800967/llama_index_core-0.12.1-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "b55d653ff8a89136b99c707e19040cf6ab8131f78fee54f43886e497b675a017",
                "md5": "d35f3d79fce28c9194be2a66fe16a667",
                "sha256": "3abcd68b019cfd58e79b07e227925cbbc86283dae1d8ab35403ff2835e5099f8"
            },
            "downloads": -1,
            "filename": "llama_index_core-0.12.1.tar.gz",
            "has_sig": false,
            "md5_digest": "d35f3d79fce28c9194be2a66fe16a667",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": "<4.0,>=3.9",
            "size": 1327794,
            "upload_time": "2024-11-20T22:39:30",
            "upload_time_iso_8601": "2024-11-20T22:39:30.634043Z",
            "url": "https://files.pythonhosted.org/packages/b5/5d/653ff8a89136b99c707e19040cf6ab8131f78fee54f43886e497b675a017/llama_index_core-0.12.1.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-11-20 22:39:30",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "run-llama",
    "github_project": "llama_index",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "lcname": "llama-index-core"
}
```