brickllm

Name: brickllm
Version: 1.2.0
Home page: https://brickllm.com/
Summary: Library for generating RDF files following BrickSchema ontology using LLM
Upload time: 2024-11-11 11:42:25
Author: Marco Perini
Requires Python: <4.0,>=3.9
License: BSD-3-Clause
Keywords: brickllm, brickschema, rdf, ontologies, knowledge graph, semantic web, ai, artificial intelligence, gpt, machine learning, natural language processing, nlp, openai, building automation, iot, graph, ontology
            <p align="center">
  <img src="https://raw.githubusercontent.com/EURAC-EEBgroup/brick-llm/refs/heads/main/docs/assets/brickllm_banner.png" alt="BrickLLM" style="width: 100%;">
</p>

# 🧱 BrickLLM

BrickLLM is a Python library for generating RDF files following the BrickSchema ontology using Large Language Models (LLMs).

## 🧰 Features

- Generate BrickSchema-compliant RDF files from natural language descriptions of buildings and facilities
- Support for multiple LLM providers (OpenAI, Anthropic, Fireworks)
- Customizable graph execution with LangGraph
- Easy-to-use API for integrating with existing projects

## 💻 Installation

You can install BrickLLM using pip:

``` bash
pip install brickllm
```

<details>
<summary><b>Development Installation</b></summary>

[Poetry](https://python-poetry.org/) is used for dependency management during development. To install BrickLLM for contributing, follow these steps:

``` bash
# Clone the repository
git clone https://github.com/EURAC-EEBgroup/brickllm-lib.git
cd brickllm-lib

# Create a virtual environment
python -m venv .venv

# Activate the virtual environment
source .venv/bin/activate # Linux/Mac
.venv\Scripts\activate # Windows

# Install Poetry and dependencies
pip install poetry
poetry install

# Install pre-commit hooks
pre-commit install
```

</details>

## 🚀 Quick Start

Here's a simple example of how to use BrickLLM:

``` python
from brickllm.graphs import BrickSchemaGraph

building_description = """
I have a building located in Bolzano.
It has 3 floors and each floor has 1 office.
There are 2 rooms in each office and each room has three sensors:
- Temperature sensor;
- Humidity sensor;
- CO sensor.
"""

# Create an instance of BrickSchemaGraph with a predefined provider
brick_graph = BrickSchemaGraph(model="openai")

# Display the graph structure
brick_graph.display()

# Prepare input data
input_data = {
    "user_prompt": building_description
}

# Run the graph
result = brick_graph.run(input_data=input_data, stream=False)

# Print the result
print(result)

# save the result to a file
brick_graph.save_ttl_output("my_building.ttl")
```
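After saving the output, a quick sanity check on the generated Turtle can catch obvious failures before loading it into a triple store. Below is a minimal standard-library sketch; the `declared_prefixes` helper and the sample string are illustrative, not part of the BrickLLM API:

``` python
import re

def declared_prefixes(turtle_text: str) -> set[str]:
    """Collect the prefix labels from @prefix declarations in a Turtle document."""
    return set(re.findall(r"@prefix\s+([A-Za-z][\w-]*):", turtle_text))

# Hypothetical fragment resembling what a run might emit
sample = """\
@prefix bldg: <http://my-bldg#> .
@prefix brick: <https://brickschema.org/schema/Brick#> .

bldg:Room_1 a brick:Room .
"""

# Verify the prefixes we expect to see are actually declared
missing = {"bldg", "brick"} - declared_prefixes(sample)
print("missing prefixes:", missing or "none")  # → missing prefixes: none
```

The same check can be pointed at the contents of `my_building.ttl` once it has been written to disk.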

<details>
<summary><b>Using Custom LLM Models</b></summary>

BrickLLM supports using custom LLM models. Here's an example using OpenAI's GPT-4o:

``` python
from brickllm.graphs import BrickSchemaGraph
from langchain_openai import ChatOpenAI

custom_model = ChatOpenAI(temperature=0, model="gpt-4o")
brick_graph = BrickSchemaGraph(model=custom_model)

# Prepare input data (building_description as defined in the Quick Start example)
input_data = {
    "user_prompt": building_description
}

# Run the graph with the custom model
result = brick_graph.run(input_data=input_data, stream=False)
```
</details>

<details>
<summary><b>Using Local LLM Models</b></summary>
<p>BrickLLM supports using local LLM models employing the <a href="https://ollama.com/">Ollama framework</a>. Currently, only our finetuned model is supported.</p>

### Option 1: Using Docker Compose

You can easily set up and run the Ollama environment using Docker Compose. The finetuned model file will be automatically downloaded inside the container. Follow these steps:

1. Clone the repository and navigate to the `finetuned` directory containing the `Dockerfile` and `docker-compose.yml`.
2. Run the following command to build and start the container:
    ```bash
    docker-compose up --build -d
    ```
3. Verify that the container is running and that port 11434 is published:
   ```bash
   docker ps
   ```
   If the output shows the port without a host mapping, like this:
   ```
   CONTAINER ID   IMAGE                         COMMAND                  CREATED          STATUS          PORTS                     NAMES
   1e9bff7c2f7b   finetuned-ollama-llm:latest   "/entrypoint.sh"         42 minutes ago   Up 42 minutes   11434/tcp                 compassionate_wing
   ```

   run the image again, explicitly publishing the port:
   ```bash
   docker run -d -p 11434:11434 finetuned-ollama-llm:latest
   docker ps
   ```

   The output should now show the port mapping:
   ```
   CONTAINER ID   IMAGE                         COMMAND                  CREATED         STATUS          PORTS                      NAMES
   df8b31d4ed86   finetuned-ollama-llm:latest   "/entrypoint.sh"         7 seconds ago   Up 7 seconds    0.0.0.0:11434->11434/tcp   eloquent_jennings
   ```
   Check that Ollama is responding on port 11434:
   ```
   curl http://localhost:11434
   ```
   The response should be:
   ```
   Ollama is running
   ```
This will download the model file, create the model in Ollama, and serve it on port `11434`. The necessary directories will be created automatically.
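The `curl` check above can also be scripted, for example to gate a pipeline on Ollama being reachable. This is a small standard-library sketch; the function name and defaults are ours, not part of BrickLLM:

``` python
from urllib.request import urlopen
from urllib.error import URLError

def ollama_is_up(base_url: str = "http://localhost:11434", timeout: float = 2.0) -> bool:
    """Return True if the Ollama endpoint answers with its greeting message."""
    try:
        with urlopen(base_url, timeout=timeout) as resp:
            return b"Ollama is running" in resp.read()
    except (URLError, OSError):
        # Connection refused, timeout, or DNS failure: treat as not running
        return False

print(ollama_is_up())
```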

### Option 2: Manual Setup

If you prefer to set up the model manually, follow these steps:

1. Download the `.gguf` file from <a href="https://huggingface.co/Giudice7/llama31-8B-brick-v8/tree/main">here</a>.
2. Create a file named `Modelfile` with the following content:
    ```bash
    FROM ./unsloth.Q4_K_M.gguf
    ```

3. Place the downloaded `.gguf` file in the same folder as the `Modelfile`.
4. Ensure Ollama is running on your system.
5. Run the following command to create the model in Ollama:
    ```bash
    ollama create llama3.1:8b-brick-v8 -f Modelfile
    ```

Once you've set up the model in Ollama, you can use it in your code as follows:

``` python
from brickllm.graphs import BrickSchemaGraphLocal

instructions = """
Your job is to generate an RDF graph in Turtle format from a description of energy systems and sensors of a building in the following input, using the Brick ontology.
### Instructions:
- Each subject, object or predicate must start with a @prefix.
- Use the prefix bldg: with IRI <http://my-bldg#> for any created entities.
- Use the prefix brick: with IRI <https://brickschema.org/schema/Brick#> for any Brick entities and relationships used.
- Use the prefix unit: with IRI <http://qudt.org/vocab/unit/> and its ontology for any unit of measure defined.
- When encoding the timeseries ID of the sensor, you must use the following format: ref:hasExternalReference [ a ref:TimeseriesReference ; ref:hasTimeseriesId 'timeseriesID' ].
- When encoding identifiers or external references, such as building/entities IDs, use the following schema: ref:hasExternalReference [ a ref:ExternalReference ; ref:hasExternalReference 'id/reference' ].
- When encoding numerical references, use the schema [brick:value 'value' ; \n brick:hasUnit unit:'unit' ] .
- When encoding coordinates, use the schema brick:coordinates [brick:latitude "lat" ; brick:longitude "long" ].
The response must be the RDF graph that includes all the @prefix of the ontologies used in the triples. The RDF graph must be created in Turtle format. Do not add any other text or comment to the response.
"""

building_description = """
The building (external ref: 'OB103'), with coordinates 33.9614, -118.3531, has a total area of 500 m². It has three zones, each with its own air temperature sensor.
The building has an electrical meter that monitors data of a power sensor. An HVAC equipment serves all three zones and its power usage is measured by a power sensor.

Timeseries IDs and unit of measure of the sensors:
- Building power consumption: '1b3e-29dk-8js7-f54v' in watts.
- HVAC power consumption: '29dh-8ks3-fvjs-d92e' in watts.
- Temperature sensor zone 1: 't29s-jk83-kv82-93fs' in celsius.
- Temperature sensor zone 2: 'f29g-js92-df73-l923' in celsius.
- Temperature sensor zone 3: 'm93d-ljs9-83ks-29dh' in celsius.
"""

# Create an instance of BrickSchemaGraphLocal
brick_graph_local = BrickSchemaGraphLocal(model="llama3.1:8b-brick")

# Display the graph structure
brick_graph_local.display()

# Prepare input data
input_data = {
    "user_prompt": building_description,
    "instructions": instructions
}

# Run the graph
result = brick_graph_local.run(input_data=input_data, stream=False)

# Print the result
print(result)

# Save the result to a file
brick_graph_local.save_ttl_output("my_building_local.ttl")
```
</details>
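Applied to one of the sensors in the example description, the encoding rules in the instructions above would produce Turtle along these lines. This is a hand-written illustration, not actual model output, and the `ref:` IRI shown is an assumption rather than something stated in this README:

``` turtle
@prefix bldg:  <http://my-bldg#> .
@prefix brick: <https://brickschema.org/schema/Brick#> .
@prefix ref:   <https://brickschema.org/schema/Brick/ref#> .

bldg:Temperature_Sensor_Zone1 a brick:Air_Temperature_Sensor ;
    ref:hasExternalReference [
        a ref:TimeseriesReference ;
        ref:hasTimeseriesId 't29s-jk83-kv82-93fs'
    ] .
```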

## 📖 Documentation

For more detailed information on how to use BrickLLM, please refer to our [documentation](https://eurac-eebgroup.github.io/brick-llm/).

## โ–ถ๏ธ Web Application

A web app is available to use the library directly through an interface at the following link ().
The application can also be used locally as described in the dedicated repository [BrickLLM App](https://github.com/EURAC-EEBgroup/Brick_ontology_tool).

**Note**: The tool is currently being deployed on our servers and on the MODERATE platform. It will be online shortly !

## ๐Ÿค Contributing

We welcome contributions to BrickLLM! Please see our [contributing guidelines](CONTRIBUTING.md) for more information.

## 📜 License

BrickLLM is released under the BSD-3-Clause License. See the [LICENSE](LICENSE) file for details.

## 📧 Contact

For any questions or support, please contact:

- Marco Perini <marco.perini@eurac.edu>
- Daniele Antonucci <daniele.antonucci@eurac.edu>
- Rocco Giudice <rocco.giudice@polito.it>

## ๐Ÿ“ Citation

Please cite us if you use the library

[![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.14039358.svg)](https://zenodo.org/doi/10.5281/zenodo.14039358)

## 💙 Acknowledgements

This work was carried out within the following European projects:

<p align="center">
  <img src="https://raw.githubusercontent.com/EURAC-EEBgroup/brick-llm/refs/heads/main/docs/assets/moderate_logo.png" alt="Moderate">
</p>

BrickLLM is developed and maintained by the Energy Efficiency in Buildings group at EURAC Research, with contributions from:
- The MODERATE project, funded by the Horizon Europe research and innovation programme under grant agreement No 101069834, which aims to develop open products for defining plausible decarbonization scenarios for the built environment
- Politecnico di Torino, in particular @Rocco Giudice, for his work on model generation using a local language model


            
