daml-dit-api
====
API definitions for DIT packages to be hosted in Daml Hub. This mainly
contains the [type definitions](daml_dit_api/package_metadata.py)
for the format of the `dabl-meta.yaml` file at the root of each DIT file.
DIT files are also used to contain integrations loaded and run by Daml
Hub. This repository also contains documentation (in this `README`)
describing the runtime environment that Daml Hub provides to
integrations.
# Package Metadata
At their core, DIT files are [ZIP archives](https://en.wikipedia.org/wiki/Zip_(file_format))
that follow a specific set of conventions regarding their content. The
most important of these conventions is the presence of a YAML metadata
file named `dabl-meta.yaml` at the root of the archive. This
metadata file contains catalog information describing the contents of
the DIT, as well as any packaging details needed to successfully
deploy a DIT file into Daml Hub. An example of a deployment instruction is
a _subdeployment_. A subdeployment instructs Daml Hub to deploy a specific
subfile within the DIT file. A DIT file that contains an embedded DAR
file could use a subdeployment to ensure that the embedded DAR file is
deployed to the ledger when the DIT is deployed. In this way, a DIT
file composed of multiple artifacts (DARs, bots, UIs, etc.) can be
constructed to deploy a set of artifacts to a single ledger in a
single action.
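As a sketch, a `dabl-meta.yaml` for a DIT that bundles a DAR might look like the following. All field values here are hypothetical; the authoritative schema is the type definitions in [`daml_dit_api/package_metadata.py`](daml_dit_api/package_metadata.py):

```yaml
catalog:
  name: example-dit
  version: 0.0.1
  description: Example DIT bundling a Daml model.
  author: Example Author
subdeployments:
  # Instructs Daml Hub to deploy this embedded DAR to the ledger
  # when the DIT itself is deployed.
  - example-model.dar
```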
# Integrations
Integrations are a special case of DIT file that are augmented with
the ability to run as an executable within a Daml Hub cluster. This is
done by packaging Python [DAZL bot](https://github.com/digital-asset/dazl-client)
code into an [executable ZIP](https://docs.python.org/3/library/zipapp.html)
using [PEX](https://github.com/pantsbuild/pex) and augmenting the
resulting file with the metadata and other resources needed to make it
a correctly formed DIT file.
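Because a DIT is ultimately a ZIP archive with `dabl-meta.yaml` at its root, its metadata can be inspected with ordinary ZIP tooling. A minimal illustration using Python's standard `zipfile` module (the archive contents here are made up, not a real DIT):

```python
import io
import zipfile

# Build a toy archive in memory that follows the DIT convention:
# a ZIP file with dabl-meta.yaml at its root.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as dit:
    dit.writestr("dabl-meta.yaml", "catalog:\n  name: example-dit\n")

# Reading the metadata back out is plain ZIP member access.
with zipfile.ZipFile(buf) as dit:
    meta = dit.read("dabl-meta.yaml").decode("utf-8")
```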
Logically speaking, Daml Hub integrations are DAZL bots packaged with
information needed to fit them into the Daml Hub runtime and user
interface. The major functional contrast between a Daml Hub integration
and a Python Bot is that the integration has the external network
access needed to connect to an outside system and the Python Bot does
not. Due to the security implications of running within Daml Hub with
external network access, integrations can only be deployed with the
approval of DA staff.
## Developing Integrations
The easiest way to develop an integration for Daml Hub is to use the
[framework library](https://github.com/digital-asset/daml-dit-if)
and [`ddit` build tool](https://github.com/digital-asset/daml-dit-ddit).
The integration framework presents a Python API closely related to the
DAZL bot API and ensures that integrations follow the conventions
required to integrate into Daml Hub.
_Unless you know exactly what you are doing and why you are doing it,
use the framework._
## The Integration Runtime Environment
By convention, integrations accept a number of environment variables
that specify key parameters. Integrations built with the framework
default these variables to values appropriate for a locally
configured sandbox instance.
Variables provided by Daml Hub include the following:
| Variable | Default | Purpose |
|----------|---------|---------|
| `DAML_LEDGER_URL` | `http://localhost:6865` | Address of local ledger gRPC API |
| `DABL_HEALTH_PORT` | 8089 | Port for HTTP endpoint. (Used for both liveness/readiness and webhooks) |
| `DABL_JWKS_URL` | | HTTP URL for JWKS Repository |
| `DABL_INTEGRATION_METADATA_PATH` | `int_args.yaml` | Path to local metadata file |
| `DABL_INTEGRATION_TYPE_ID` | | Type ID for the specific integration within the DIT to run |
| `DABL_LEDGER_PARTY` | | Party identifier for network connection |
| `DABL_LOG_LEVEL` | 0 | Log verbosity level, from 0 to 50 inclusive |
(Note that for legacy reasons, the ledger URL is also available under
the `DABL_LEDGER_URL` environment variable.)
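The defaulting behavior described above can be sketched in a few lines. The helper name `integration_config` is illustrative only, not part of the API; the framework handles this internally:

```python
import os

def integration_config(env):
    """Resolve key integration parameters from an environment mapping,
    applying the documented local-sandbox defaults where unset."""
    return {
        "ledger_url": env.get("DAML_LEDGER_URL", "http://localhost:6865"),
        "health_port": int(env.get("DABL_HEALTH_PORT", "8089")),
        "metadata_path": env.get("DABL_INTEGRATION_METADATA_PATH", "int_args.yaml"),
        "log_level": int(env.get("DABL_LOG_LEVEL", "0")),
    }

# In a real integration, the values would come from os.environ.
config = integration_config(os.environ)
```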
## Logging
Daml Hub integrations use the standard Python `logging` package, and the
framework provides specific support for controlling log level at
runtime. To integrate properly with this logic, it is important that
integrations use the `integration` logger. This logger is switched from
`INFO` level to `DEBUG` level at a `DABL_LOG_LEVEL` setting of 10 or above.
The preferred way of creating an integration logger is via the
`getIntegrationLogger` function in the API package.
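For illustration only, the level-switching behavior can be sketched with the standard `logging` module directly; in real code, obtain the logger via `getIntegrationLogger` and let the framework manage levels:

```python
import logging

# The shared "integration" logger is the one the framework's runtime
# log-level control operates on.
LOG = logging.getLogger("integration")

def apply_dabl_log_level(dabl_log_level: int) -> None:
    # Sketch of the documented behavior: a DABL_LOG_LEVEL of 10 or
    # above switches the logger from INFO to DEBUG.
    LOG.setLevel(logging.DEBUG if dabl_log_level >= 10 else logging.INFO)

apply_dabl_log_level(10)
LOG.debug("verbose diagnostics enabled")
```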