<!--
<p align="center">
  <img src="https://github.com/data-literacy-alliance/dalia-dif/raw/main/docs/source/logo.png" height="150">
</p>
-->
<h1 align="center">
  DALIA Interaction Format (DIF)
</h1>
<p align="center">
    <a href="https://github.com/data-literacy-alliance/dalia-dif/actions/workflows/tests.yml">
        <img alt="Tests" src="https://github.com/data-literacy-alliance/dalia-dif/actions/workflows/tests.yml/badge.svg" /></a>
    <a href="https://pypi.org/project/dalia_dif">
        <img alt="PyPI" src="https://img.shields.io/pypi/v/dalia_dif" /></a>
    <a href="https://pypi.org/project/dalia_dif">
        <img alt="PyPI - Python Version" src="https://img.shields.io/pypi/pyversions/dalia_dif" /></a>
    <a href="https://github.com/data-literacy-alliance/dalia-dif/blob/main/LICENSE">
        <img alt="PyPI - License" src="https://img.shields.io/pypi/l/dalia_dif" /></a>
    <a href='https://dalia_dif.readthedocs.io/en/latest/?badge=latest'>
        <img src='https://readthedocs.org/projects/dalia_dif/badge/?version=latest' alt='Documentation Status' /></a>
    <a href="https://codecov.io/gh/data-literacy-alliance/dalia-dif/branch/main">
        <img src="https://codecov.io/gh/data-literacy-alliance/dalia-dif/branch/main/graph/badge.svg" alt="Codecov status" /></a>  
    <a href="https://github.com/cthoyt/cookiecutter-python-package">
        <img alt="Cookiecutter template from @cthoyt" src="https://img.shields.io/badge/Cookiecutter-snekpack-blue" /></a>
    <a href="https://github.com/astral-sh/ruff">
        <img src="https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/astral-sh/ruff/main/assets/badge/v2.json" alt="Ruff" style="max-width:100%;"></a>
    <a href="https://github.com/data-literacy-alliance/dalia-dif/blob/main/.github/CODE_OF_CONDUCT.md">
        <img src="https://img.shields.io/badge/Contributor%20Covenant-2.1-4baaaa.svg" alt="Contributor Covenant"/></a>
    <!-- uncomment if you archive on zenodo
    <a href="https://zenodo.org/badge/latestdoi/XXXXXX">
        <img src="https://zenodo.org/badge/XXXXXX.svg" alt="DOI"></a>
    -->
</p>
The DALIA Interaction Format (DIF) v1.3 is the data model and CSV-based input
format for open educational resources (OERs) in the DALIA OER platform.
This repository contains an implementation of the data model in Pydantic, a
workflow for serializing to RDF based on
[pydantic-metamodel](https://github.com/cthoyt/pydantic-metamodel), a CSV
reader, and a command line validator.
A tutorial for curating OERs in tabular (CSV) form can be found in
[docs/curation.md](docs/curation.md).
## πͺ Getting Started
The `dalia_dif` command line tool can be used from the console to validate
CSV files (both local and remote).
```console
$ dalia_dif validate https://raw.githubusercontent.com/NFDI4BIOIMAGE/training/refs/heads/main/docs/export/DALIA_training_materials.csv
```
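Under the hood, validation amounts to reading each CSV row and checking it against the DIF data model. A minimal, self-contained sketch of the same idea using only the standard library (the required column names here are illustrative, not DIF's actual schema):

```python
import csv
import io

# Illustrative required columns; the real DIF v1.3 schema defines its own.
REQUIRED_COLUMNS = {"title", "links", "license"}


def validate_csv(text: str) -> list[str]:
    """Return a list of human-readable validation errors for a CSV document."""
    reader = csv.DictReader(io.StringIO(text))
    errors = []
    missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
    if missing:
        errors.append(f"missing columns: {sorted(missing)}")
        return errors
    for i, row in enumerate(reader, start=2):  # row 1 is the header
        for column in REQUIRED_COLUMNS:
            if not row[column].strip():
                errors.append(f"row {i}: empty value for {column!r}")
    return errors
```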
Serialize to RDF with `dalia_dif convert`. It guesses the output format from
the file extension; currently, `.ttl` and `.jsonl` are supported.
```console
$ dalia_dif convert -o output.ttl https://raw.githubusercontent.com/NFDI4BIOIMAGE/training/refs/heads/main/docs/export/DALIA_training_materials.csv
```
```console
$ dalia_dif convert -o output.jsonl https://raw.githubusercontent.com/NFDI4BIOIMAGE/training/refs/heads/main/docs/export/DALIA_training_materials.csv
```
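The extension-based dispatch described above can be sketched as follows; this is a simplified illustration, not the actual implementation inside `dalia_dif convert`:

```python
from pathlib import Path

# Illustrative mapping from output extension to serialization format.
EXTENSION_TO_FORMAT = {".ttl": "turtle", ".jsonl": "jsonl"}


def guess_format(output_path: str) -> str:
    """Guess the serialization format from an output file's extension."""
    suffix = Path(output_path).suffix.lower()
    try:
        return EXTENSION_TO_FORMAT[suffix]
    except KeyError:
        raise ValueError(f"unsupported output extension: {suffix!r}") from None
```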
Using the data model:
```python
from pydantic_metamodel.api import PredicateObject
from dalia_dif.dif13.model import AuthorDIF13, EducationalResourceDIF13
from dalia_dif.dif13.picklists import (
    MEDIA_TYPES,
    PROFICIENCY_LEVELS,
    RELATED_WORKS_RELATIONS,
    TARGET_GROUPS,
    LEARNING_RESOURCE_TYPES,
)
from dalia_dif.namespace import DALIA_COMMUNITY, HSFS
resource = EducationalResourceDIF13(
    uuid="b37ddf6e-f136-4230-8418-faf18c4c34d2",
    title="Chemotion ELN Instruction Videos",
    description="Chemotion ELN Instruction Videos Chemotion[1] is an open source "
                "system for storing and managing experiments and molecular data in "
                "chemistry and its related sciences.",
    links=["https://doi.org/10.5281/zenodo.7634481"],
    authors=[
        AuthorDIF13(given_name="Fabian", family_name="Fink", orcid="0000-0002-1863-2087"),
        AuthorDIF13(given_name="Salim", family_name="Benjamaa", orcid="0000-0001-6215-6834"),
        AuthorDIF13(given_name="Nicole", family_name="Parks", orcid="0000-0002-6243-2840"),
        AuthorDIF13(
            given_name="Alexander", family_name="Hoffmann", orcid="0000-0002-9647-8839"
        ),
        AuthorDIF13(
            given_name="Sonja", family_name="Herres-Pawlis", orcid="0000-0002-4354-4353"
        ),
    ],
    license="https://creativecommons.org/licenses/by/4.0",
    supporting_communities=[],
    recommending_communities=[
        DALIA_COMMUNITY["bead62a8-c3c2-46d6-9eb1-ffeaba38d5bf"],  # NFDI4Chem
    ],
    disciplines=[
        HSFS["n40"],  # chemistry
    ],
    file_formats=[
        ".mp4",
    ],
    keywords=["research data management", "NFDI", "RDM", "FDM", "NFDI4Chem", "Chemotion"],
    languages=["eng"],
    learning_resource_types=[
        LEARNING_RESOURCE_TYPES["tutorial"],
    ],
    media_types=[
        MEDIA_TYPES["video"],
    ],
    proficiency_levels=[
        PROFICIENCY_LEVELS["novice"],
    ],
    publication_date="2023-02-13",
    target_groups=[
        TARGET_GROUPS["student (ba)"],
    ],
    related_works=[
        PredicateObject(
            predicate=RELATED_WORKS_RELATIONS["isTranslationOf"],
            object="https://id.dalia.education/learning-resource/20be255e-e2da-4f9c-90b3-5573d6a12619",
        )
    ],
    file_size="703.2 MB",
    version=None,
)
turtle_str = resource.model_dump_turtle()
```
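Author records above carry ORCID iDs, whose final character is an ISO 7064 mod 11-2 check digit. A small, package-independent helper for sanity-checking ORCIDs during curation:

```python
def orcid_checksum_ok(orcid: str) -> bool:
    """Verify the ISO 7064 mod 11-2 check digit of an ORCID iD."""
    digits = orcid.replace("-", "")
    if len(digits) != 16:
        return False
    total = 0
    for char in digits[:-1]:
        if not char.isdigit():
            return False
        total = (total + int(char)) * 2
    # The check character is 0-9 or 'X' (which stands for 10).
    expected = (12 - total % 11) % 11
    check = "X" if expected == 10 else str(expected)
    return digits[-1].upper() == check
```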
## π Installation
The most recent release can be installed from
[PyPI](https://pypi.org/project/dalia_dif/) with uv:
```console
$ uv pip install dalia_dif
```
or with pip:
```console
$ python3 -m pip install dalia_dif
```
The most recent code and data can be installed directly from GitHub with uv:
```console
$ uv pip install git+https://github.com/data-literacy-alliance/dalia-dif.git
```
or with pip:
```console
$ python3 -m pip install git+https://github.com/data-literacy-alliance/dalia-dif.git
```
## π Contributing
Contributions, whether filing an issue, making a pull request, or forking, are
appreciated. See
[CONTRIBUTING.md](https://github.com/data-literacy-alliance/dalia-dif/blob/master/.github/CONTRIBUTING.md)
for more information on getting involved.
## π Attribution
### βοΈ License
The code in this package is licensed under the MIT License.
### π Citation
An abstract describing the DIF has been published in the proceedings of the
2<sup>nd</sup> Conference on Research Data Infrastructure (CoRDI).
```bibtex
@misc{steiner2025,
    author = {Steiner, Petra C. and Geiger, Jonathan D. and Fuhrmans, Marc and Amer Desouki, Abdelmoneim and HΓΌppe, Henrika M.},
    title = {The Revised DALIA Interchange Format - New Picklists for Describing Open Educational Resources},
    month = aug,
    year = 2025,
    publisher = {Zenodo},
    doi = {10.5281/zenodo.16736170},
    url = {https://doi.org/10.5281/zenodo.16736170},
}
```
### π Support
This project has been supported by the following organizations (in alphabetical
order):
- [NFDI4Chem](https://www.nfdi4chem.de)
- [NFDI4Culture](https://nfdi4culture.de)
- [NFDI4Ing](https://nfdi4ing.de)
### π° Funding
This project has been supported by the following grants:
| Funding Body                                                       | Program | Grant Number |
| ------------------------------------------------------------------ | ------- | ------------ |
| German Federal Ministry of Research, Technology, and Space (BMFTR) |         | 16DWWQP07    |
| EU Capacity Building and Resilience Facility                       |         | 16DWWQP07    |
### πͺ Cookiecutter
This package was created with
[@audreyfeldroy](https://github.com/audreyfeldroy)'s
[cookiecutter](https://github.com/cookiecutter/cookiecutter) package using
[@cthoyt](https://github.com/cthoyt)'s
[cookiecutter-snekpack](https://github.com/cthoyt/cookiecutter-snekpack)
template.
## π οΈ For Developers
<details>
  <summary>See developer instructions</summary>
This section of the README is for readers who want to get involved by making a
code contribution.
### Development Installation
To install in development mode, use the following:
```console
$ git clone https://github.com/data-literacy-alliance/dalia-dif.git
$ cd dalia-dif
$ uv pip install -e .
```
Alternatively, install using pip:
```console
$ python3 -m pip install -e .
```
### π₯Ό Testing
After cloning the repository and installing `tox` with
`uv tool install tox --with tox-uv` or `python3 -m pip install tox tox-uv`, the
unit tests in the `tests/` folder can be run reproducibly with:
```console
$ tox -e py
```
Additionally, these tests are automatically re-run with each commit in a
[GitHub Action](https://github.com/data-literacy-alliance/dalia-dif/actions?query=workflow%3ATests).
### π Building the Documentation
The documentation can be built locally using the following:
```console
$ git clone https://github.com/data-literacy-alliance/dalia-dif.git
$ cd dalia-dif
$ tox -e docs
$ open docs/build/html/index.html
```
The documentation automatically installs the package as well as the `docs` extra
specified in the [`pyproject.toml`](pyproject.toml). `sphinx` plugins like
`texext` can be added there. Additionally, they need to be added to the
`extensions` list in [`docs/source/conf.py`](docs/source/conf.py).
The documentation can be deployed to [ReadTheDocs](https://readthedocs.io) using
[this guide](https://docs.readthedocs.io/en/stable/intro/import-guide.html). The
[`.readthedocs.yml`](.readthedocs.yml) YAML file contains all the configuration
you'll need. You can also set up continuous integration on GitHub to check not
only that Sphinx can build the documentation in an isolated environment (i.e.,
with `tox -e docs-test`) but also that
[ReadTheDocs can build it too](https://docs.readthedocs.io/en/stable/pull-requests.html).
</details>
## π§βπ» For Maintainers
<details>
  <summary>See maintainer instructions</summary>
### Initial Configuration
#### Configuring ReadTheDocs
[ReadTheDocs](https://readthedocs.org) is an external documentation hosting
service that integrates with GitHub's CI/CD. Do the following for each
repository:
1. Log in to ReadTheDocs with your GitHub account to install the integration at
   https://readthedocs.org/accounts/login/?next=/dashboard/
2. Import your project by navigating to https://readthedocs.org/dashboard/import
   then clicking the plus icon next to your repository
3. You can rename the repository on the next screen using a more stylized name
   (i.e., with spaces and capital letters)
4. Click next, and you're good to go!
#### Configuring Archival on Zenodo
[Zenodo](https://zenodo.org) is a long-term archival system that assigns a DOI
to each release of your package. Do the following for each repository:
1. Log in to Zenodo via GitHub with this link:
   https://zenodo.org/oauth/login/github/?next=%2F. This brings you to a page
   that lists all of your organizations and asks you to approve installing the
   Zenodo app on GitHub. Click "grant" next to any organizations you want to
   enable the integration for, then click the big green "approve" button. This
   step only needs to be done once.
2. Navigate to https://zenodo.org/account/settings/github/, which lists all of
   your GitHub repositories (both in your username and any organizations you
   enabled). Click the on/off toggle for any relevant repositories. When you
   make a new repository, you'll have to come back to this page.
After these steps, you're ready to go! After you make a release on GitHub (steps
for this are below), you can navigate to
https://zenodo.org/account/settings/github/repository/data-literacy-alliance/dalia-dif
to see the DOI for the release and link to the Zenodo record for it.
#### Registering with the Python Package Index (PyPI)
The [Python Package Index (PyPI)](https://pypi.org) hosts packages so they can
be easily installed with `pip`, `uv`, and equivalent tools.
1. Register for an account [here](https://pypi.org/account/register)
2. Navigate to https://pypi.org/manage/account and make sure you have verified
   your email address. A verification email might not have been sent by default,
   so you might have to click the "options" dropdown next to your address to get
   to the "re-send verification email" button
3. Two-factor authentication has been required for PyPI since the end of 2023
   (see this
   [blog post from PyPI](https://blog.pypi.org/posts/2023-05-25-securing-pypi-with-2fa/)).
   This means you have to first issue account recovery codes, then set up
   two-factor authentication
4. Issue an API token from https://pypi.org/manage/account/token
This only needs to be done once per developer.
#### Configuring your machine's connection to PyPI
This needs to be done once per machine.
```console
$ uv tool install keyring
$ keyring set https://upload.pypi.org/legacy/ __token__
$ keyring set https://test.pypi.org/legacy/ __token__
```
Note that this deprecates previous workflows using `.pypirc`.
### π¦ Making a Release
#### Uploading to PyPI
After installing the package in development mode and installing `tox` with
`uv tool install tox --with tox-uv` or `python3 -m pip install tox tox-uv`, run
the following from the console:
```console
$ tox -e finish
```
This script does the following:
1. Uses [bump-my-version](https://github.com/callowayproject/bump-my-version) to
   switch the version number in the `pyproject.toml`, `CITATION.cff`,
   `src/dalia_dif/version.py`, and [`docs/source/conf.py`](docs/source/conf.py)
   to not have the `-dev` suffix
2. Packages the code in both a tar archive and a wheel using
   [`uv build`](https://docs.astral.sh/uv/guides/publish/#building-your-package)
3. Uploads to PyPI using
   [`uv publish`](https://docs.astral.sh/uv/guides/publish/#publishing-your-package).
4. Pushes to GitHub. You'll need to make a release corresponding to the commit
   where the version was bumped.
5. Bumps the version to the next patch. If you made big changes and want to
   bump the minor version instead, run `tox -e bumpversion -- minor` afterwards.
#### Releasing on GitHub
1. Navigate to https://github.com/data-literacy-alliance/dalia-dif/releases/new
   to draft a new release
2. Click the "Choose a Tag" dropdown and select the tag corresponding to the
   release you just made
3. Click the "Generate Release Notes" button to get a quick outline of recent
   changes. Modify the title and description as you see fit
4. Click the big green "Publish Release" button
This will trigger Zenodo to assign a DOI to your release as well.
### Updating Package Boilerplate
This project uses `cruft` to keep boilerplate (i.e., configuration, contribution
guidelines, documentation configuration) up-to-date with the upstream
cookiecutter package. Install cruft with either `uv tool install cruft` or
`python3 -m pip install cruft` then run:
```console
$ cruft update
```
More info on Cruft's update command is available
[here](https://github.com/cruft/cruft?tab=readme-ov-file#updating-a-project).
</details>