<p align="center">
<img align="center" src="docs/docs/static/img/dspy_logo.png" width="460px" />
</p>
## DSPy: _Programming_—not prompting—Foundation Models
**Documentation:** [DSPy Docs](https://dspy.ai/)
[![Downloads](https://static.pepy.tech/badge/dspy-ai)](https://pepy.tech/project/dspy-ai) [![Downloads/Month](https://static.pepy.tech/badge/dspy-ai/month)](https://pepy.tech/project/dspy-ai)
----
DSPy is the open-source framework for **building high-quality, modular AI systems** by _programming—rather than prompting—language models_. It provides abstractions and algorithms for **optimizing the prompts and weights** in LM programs, ranging from simple classifiers to sophisticated RAG pipelines and agent loops.
Instead of writing brittle, model-specific prompts, you write compositional code and use DSPy optimizers to teach models like `GPT-4o` or `Llama-3.2` to **deliver higher-quality outputs** or avoid specific failure patterns. In essence, DSPy optimizers _compile your high-level code_ into low-level computations, prompts, or weight updates that **align your LM with your program's structure and metrics**.
DSPy stands for Declarative Self-improving Python. This [recent lecture](https://www.youtube.com/watch?v=JEMYuzrKLUw) is a good conceptual introduction. Our [Discord server](https://discord.gg/XCGy2WDCQB) is a great place to meet the community, seek help, or start contributing.
## Documentation: [dspy.ai](https://dspy.ai)
**Please go to the [DSPy Docs at dspy.ai](https://dspy.ai)**
## Installation
```bash
pip install dspy
```
To install the very latest from `main`:
```bash
pip install git+https://github.com/stanfordnlp/dspy.git
```
## 📜 Citation & Reading More
**[Jun'24] [Optimizing Instructions and Demonstrations for Multi-Stage Language Model Programs](https://arxiv.org/abs/2406.11695)**
**[Oct'23] [DSPy: Compiling Declarative Language Model Calls into Self-Improving Pipelines](https://arxiv.org/abs/2310.03714)**
[Jul'24] [Fine-Tuning and Prompt Optimization: Two Great Steps that Work Better Together](https://arxiv.org/abs/2407.10930)
[Jun'24] [Prompts as Auto-Optimized Training Hyperparameters](https://arxiv.org/abs/2406.11706)
[Feb'24] [Assisting in Writing Wikipedia-like Articles From Scratch with Large Language Models](https://arxiv.org/abs/2402.14207)
[Jan'24] [In-Context Learning for Extreme Multi-Label Classification](https://arxiv.org/abs/2401.12178)
[Dec'23] [DSPy Assertions: Computational Constraints for Self-Refining Language Model Pipelines](https://arxiv.org/abs/2312.13382)
[Dec'22] [Demonstrate-Search-Predict: Composing Retrieval & Language Models for Knowledge-Intensive NLP](https://arxiv.org/abs/2212.14024)
To stay up to date or learn more, follow [@lateinteraction](https://twitter.com/lateinteraction) on Twitter.
The **DSPy** logo is designed by **Chuyi Zhang**.
If you use DSPy or DSP in a research paper, please cite our work as follows:
```bibtex
@inproceedings{khattab2024dspy,
  title={DSPy: Compiling Declarative Language Model Calls into Self-Improving Pipelines},
  author={Khattab, Omar and Singhvi, Arnav and Maheshwari, Paridhi and Zhang, Zhiyuan and Santhanam, Keshav and Vardhamanan, Sri and Haq, Saiful and Sharma, Ashutosh and Joshi, Thomas T. and Moazam, Hanna and Miller, Heather and Zaharia, Matei and Potts, Christopher},
  booktitle={The Twelfth International Conference on Learning Representations},
  year={2024}
}
@article{khattab2022demonstrate,
  title={Demonstrate-Search-Predict: Composing Retrieval and Language Models for Knowledge-Intensive {NLP}},
  author={Khattab, Omar and Santhanam, Keshav and Li, Xiang Lisa and Hall, David and Liang, Percy and Potts, Christopher and Zaharia, Matei},
  journal={arXiv preprint arXiv:2212.14024},
  year={2022}
}
```
<!-- You can also read more about the evolution of the framework from Demonstrate-Search-Predict to DSPy:
* [**DSPy Assertions: Computational Constraints for Self-Refining Language Model Pipelines**](https://arxiv.org/abs/2312.13382) (Academic Paper, Dec 2023)
* [**DSPy: Compiling Declarative Language Model Calls into Self-Improving Pipelines**](https://arxiv.org/abs/2310.03714) (Academic Paper, Oct 2023)
* [**Releasing DSPy, the latest iteration of the framework**](https://twitter.com/lateinteraction/status/1694748401374490946) (Twitter Thread, Aug 2023)
* [**Releasing the DSP Compiler (v0.1)**](https://twitter.com/lateinteraction/status/1625231662849073160) (Twitter Thread, Feb 2023)
* [**Introducing DSP**](https://twitter.com/lateinteraction/status/1617953413576425472) (Twitter Thread, Jan 2023)
* [**Demonstrate-Search-Predict: Composing retrieval and language models for knowledge-intensive NLP**](https://arxiv.org/abs/2212.14024.pdf) (Academic Paper, Dec 2022) -->