ontogpt

* Name: ontogpt
* Version: 1.0.18
* Summary: OntoGPT is a Python package for extracting structured information from text with large language models (LLMs), instruction prompts, and ontology-based grounding.
* Upload time: 2025-09-18 16:36:42
* Requires Python: >=3.9, !=3.9.7, <3.14
* License: BSD-3
* Home page, maintainer, docs URL, author, keywords: not recorded
* Requirements: none recorded

            # OntoGPT

![OntoGPT Logo](/images/ontogpt_logo_3.jpg)

[![DOI](https://zenodo.org/badge/13996/monarch-initiative/ontogpt.svg)](https://zenodo.org/badge/latestdoi/13996/monarch-initiative/ontogpt)
![PyPI](https://img.shields.io/pypi/v/ontogpt)

## Introduction

_OntoGPT_ is a Python package for extracting structured information from text with large language models (LLMs), _instruction prompts_, and ontology-based grounding.

[For more details, please see the full documentation.](https://monarch-initiative.github.io/ontogpt/)

## Quick Start

OntoGPT runs on the command line, though there's also a minimal web app interface (see the `Web Application` section below).

1. Ensure you have a supported version of Python installed (3.9 or newer, excluding 3.9.7, and below 3.14).
2. Install with `pip`:

    ```bash
    pip install ontogpt
    ```

3. Set your OpenAI API key:

    ```bash
    runoak set-apikey -e openai <your openai api key>
    ```

4. See the list of all OntoGPT commands:

    ```bash
    ontogpt --help
    ```

5. Try a simple example of information extraction:

    ```bash
    echo "One treatment for high blood pressure is carvedilol." > example.txt
    ontogpt extract -i example.txt -t drug
    ```

    OntoGPT will retrieve the necessary ontologies and output results to the command line. Your output will provide all extracted objects under the heading `extracted_object`.
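
    Because results are written to standard output, they can also be captured in a file with ordinary shell redirection. Here is a minimal sketch reusing the example above; the exact report format may vary between OntoGPT versions.

    ```bash
    # Save the report (including the extracted_object block) to a file
    # instead of reading it in the terminal
    ontogpt extract -i example.txt -t drug > drug_extraction_results.txt
    ```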

## Web Application

There is a bare-bones web application for running OntoGPT and viewing results.

First, install the required dependencies with `pip` by running the following command:

```bash
pip install ontogpt[web]
```
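
If your shell treats square brackets as glob patterns (zsh does, for example), quote the requirement so `pip` receives it intact:

```bash
# Same install, quoted for shells that expand [ ] as globs
pip install "ontogpt[web]"
```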

Then run this command to start the web application:

```bash
web-ontogpt
```

NOTE: We do not recommend hosting this webapp publicly without authentication.

## Model APIs

OntoGPT uses the `litellm` package (<https://litellm.vercel.app/>) to interface with LLMs.

This means most APIs are supported, including OpenAI, Azure, Anthropic, Mistral, Replicate, and beyond.

To find the name of a model to use, run `ontogpt list-models`, then pass the name in the first column to the `--model` option.

In most cases, this will require setting the API key for a particular service as above:

```bash
runoak set-apikey -e anthropic-key <your anthropic api key>
```
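
Once the key is in place, pass the chosen model name to `extract` with `--model`. This is a sketch reusing the Quick Start example; substitute a real name from the first column of `ontogpt list-models`:

```bash
# Run the Quick Start extraction with an explicitly chosen model
ontogpt extract -i example.txt -t drug --model <model name from ontogpt list-models>
```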

Some endpoints, such as OpenAI models through Azure, require setting additional details. These may be set similarly:

```bash
runoak set-apikey -e azure-key <your azure api key>
runoak set-apikey -e azure-base <your azure endpoint url>
runoak set-apikey -e azure-version <your azure api version, e.g. "2023-05-15">
```

These details may also be set as environment variables as follows:

```bash
export AZURE_API_KEY="my-azure-api-key"
export AZURE_API_BASE="https://example-endpoint.openai.azure.com"
export AZURE_API_VERSION="2023-05-15"
```
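
With these values set, an Azure-hosted deployment is selected like any other model. The sketch below assumes litellm's `azure/<deployment name>` naming convention; confirm the exact identifier with `ontogpt list-models`:

```bash
# Use an Azure-hosted OpenAI deployment for the Quick Start example
# (the azure/ prefix follows litellm's convention and is an assumption here)
ontogpt extract -i example.txt -t drug --model azure/<your deployment name>
```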

## Open Models

Open LLMs may be retrieved and run locally through the `ollama` tool (<https://ollama.com/>).

You will need to install `ollama` (see the [GitHub repo](https://github.com/ollama/ollama)), and you may need to start it as a service with a command like `ollama serve` or `sudo systemctl start ollama`.

Then retrieve a model with `ollama pull <modelname>`, e.g., `ollama pull llama3`.

To use the model in OntoGPT, prefix its name with `ollama/` (e.g., `ollama/llama3`) and pass it to the `--model` option.

Some `ollama` models may not be listed in `ontogpt list-models`, but the full list of downloaded LLMs can be seen with the `ollama list` command.
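
Putting these steps together, a minimal local-model run might look like the following, assuming `ollama` is installed and already running as a service, and reusing `example.txt` from the Quick Start:

```bash
# Download a local model, then point OntoGPT at it with the ollama/ prefix
ollama pull llama3
ontogpt extract -i example.txt -t drug --model ollama/llama3
```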

## Evaluations

OntoGPT's functions have been evaluated on test data. Please see the full documentation for details on these evaluations and how to reproduce them.

## Related Projects

* [TALISMAN](https://github.com/monarch-initiative/talisman/), a tool for generating summaries of functions enriched within a gene set. TALISMAN uses OntoGPT to work with LLMs.

## Tutorials and Presentations

* Presentation: "Staying grounded: assembling structured biological knowledge with help from large language models" - presented by Harry Caufield as part of the AgBioData Consortium webinar series (September 2023)
  * [Slides](https://docs.google.com/presentation/d/1rMQVWaMju-ucYFif5nx4Xv3bNX2SVI_w89iBIT1bkV4/edit?usp=sharing)
  * [Video](https://www.youtube.com/watch?v=z38lI6WyBsY)
* Presentation: "Transforming unstructured biomedical texts with large language models" - presented by Harry Caufield as part of the BOSC track at ISMB/ECCB 2023 (July 2023)
  * [Slides](https://docs.google.com/presentation/d/1LsOTKi-rXYczL9vUTHB1NDkaEqdA9u3ZFC5ANa0x1VU/edit?usp=sharing)
  * [Video](https://www.youtube.com/watch?v=a34Yjz5xPp4)
* Presentation: "OntoGPT: A framework for working with ontologies and large language models" - talk by Chris Mungall at Joint Food Ontology Workgroup (May 2023)
  * [Slides](https://docs.google.com/presentation/d/1CosJJe8SqwyALyx85GWkw9eOT43B4HwDlAY2CmkmJgU/edit)
  * [Video](https://www.youtube.com/watch?v=rt3wobA9hEs&t=1955s)

## Citation

The information extraction approach used in OntoGPT, SPIRES, is described further in: Caufield JH, Hegde H, Emonet V, Harris NL, Joachimiak MP, Matentzoglu N, et al. Structured prompt interrogation and recursive extraction of semantics (SPIRES): A method for populating knowledge bases using zero-shot learning. _Bioinformatics_, Volume 40, Issue 3, March 2024, btae104, [https://doi.org/10.1093/bioinformatics/btae104](https://doi.org/10.1093/bioinformatics/btae104).

## Acknowledgements

This project is part of the [Monarch Initiative](https://monarchinitiative.org/). We also gratefully acknowledge [Bosch Research](https://www.bosch.com/research) for their support of this research project.

            
