dapla-statbank-client

Name: dapla-statbank-client
Version: 1.2.3
Home page: https://github.com/statisticsnorway/dapla-statbank-client
Summary: Handles data transfer Statbank <-> Dapla for Statistics Norway
Upload time: 2024-05-22 10:14:52
Author: Statistics Norway
Requires Python: <4,>=3.10
License: MIT
# Dapla Statbank Client


[![PyPI](https://img.shields.io/pypi/v/dapla-statbank-client.svg)][pypi status]
[![Status](https://img.shields.io/pypi/status/dapla-statbank-client.svg)][pypi status]
[![Python Version](https://img.shields.io/pypi/pyversions/dapla-statbank-client)][pypi status]
[![License](https://img.shields.io/pypi/l/dapla-statbank-client)][license]

[![Documentation](https://github.com/statisticsnorway/dapla-statbank-client/actions/workflows/docs.yml/badge.svg)][documentation]
[![Tests](https://github.com/statisticsnorway/dapla-statbank-client/actions/workflows/tests.yml/badge.svg)][tests]
[![Coverage](https://sonarcloud.io/api/project_badges/measure?project=statisticsnorway_dapla-statbank-client&metric=coverage)][sonarcov]
[![Quality Gate Status](https://sonarcloud.io/api/project_badges/measure?project=statisticsnorway_dapla-statbank-client&metric=alert_status)][sonarquality]

[![pre-commit](https://img.shields.io/badge/pre--commit-enabled-brightgreen?logo=pre-commit&logoColor=white)][pre-commit]
[![Black](https://img.shields.io/badge/code%20style-black-000000.svg)][black]
[![Ruff](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/astral-sh/ruff/main/assets/badge/v2.json)](https://github.com/astral-sh/ruff)
[![Poetry](https://img.shields.io/endpoint?url=https://python-poetry.org/badge/v0.json)][poetry]

[pypi status]: https://pypi.org/project/dapla-statbank-client/
[documentation]: https://statisticsnorway.github.io/dapla-statbank-client
[tests]: https://github.com/statisticsnorway/dapla-statbank-client/actions?workflow=Tests

[sonarcov]: https://sonarcloud.io/summary/overall?id=statisticsnorway_dapla-statbank-client
[sonarquality]: https://sonarcloud.io/summary/overall?id=statisticsnorway_dapla-statbank-client
[pre-commit]: https://github.com/pre-commit/pre-commit
[black]: https://github.com/psf/black
[poetry]: https://python-poetry.org/


Used internally by Statistics Norway (SSB).
Validates and transfers data from Dapla to Statbank,
and retrieves data from the public and internal Statbank.


## Installing from PyPI with Poetry
If your project was set up with `ssb-project create`, navigate into its folder in a terminal (`cd project-name`), then install the package:
```bash
poetry add dapla-statbank-client
ssb-project build
```
Create a notebook with the project's kernel and run this code to verify that you can "log in":
```python
from statbank import StatbankClient
stat_client = StatbankClient()
# You will be prompted for your loaduser (lastebruker) and its password
# The default publishing date is TOMORROW
print(stat_client)
# Printing shows all the default settings on the client
# You can change, for example, the date by specifying it: StatbankClient(date="2023-02-16")
```

Be aware that from the **dapla-staging environment** you will be sending to the statbank **TEST** database, and your changes will not be published. For this you need the "test password", which belongs to the same user (lastebruker) but differs from the ordinary password (lastepassord). If you are missing the test password, ask the statbank team to send it to you for your loaduser. If you are in the main dapla-jupyterlab (prod), you **WILL** publish to statbanken, in the PROD database. So pay extra attention to the **publishing date** when in dapla-main-prod-jupyterlab, and be aware of which password you are entering, based on your environment. [To see data actually published to the test database, you can use this link if you work at SSB.](https://i.test.ssb.no/pxwebi/pxweb/no/test_24v_intern/)


## Usage: Transferring

```python
stat_client.transfer({"deltabellfilnavn.dat": df_06339}, "06339")
```
The simplest usage is transferring directly with the client's transfer method. Statbanken expects named "deltabeller" in a dictionary; see `transferdata_template()` below for an easy way to get the deltabell names into a dict. This might be all you need if data of the same shape has been sent to statbanken before. If you are at all unsure, keep reading.


## Building datasets
You can look at the "filbeskrivelse", which is returned from `stat_client.get_description()` as its own class: StatbankUttrekksBeskrivelse
```python
description_06339 = stat_client.get_description(tableid="06339")
print(description_06339)
```
This should contain all the information you are used to reading out of the old "Filbeskrivelse", and it describes how to construct your data.

Your data must be placed in a dict of pandas DataFrames. Take a look at how the dict should be constructed with:
```python
description_06339.transferdata_template()
```
This both returns the dict and prints it, depending on what you want to do with it. Insert your own DataFrames into it, then pass it to `.validate()` and/or `.transfer()`. It might look like this:
```python
{"deltabellfilnavn.dat": df_06339}
```

Other interesting attributes can be retrieved from the UttrekksBeskrivelse object:
```python
description_06339.subtables
description_06339.variables
description_06339.codelists
description_06339.suppression
```

After starting to construct your data, you can validate it against the Uttrekksbeskrivelse with the validate method, *without starting a transfer*. Validation runs against rules in this package; no data is actually sent to statbanken. Call `validate` like this:
```python
stat_client.validate({"deltabellfilnavn.dat": df_06339}, tableid="06339")
```
Validation happens client-side in Python by default.
It checks the number of tables, the number of columns, code usage in categorical columns, code usage in "suppression columns" (prikkekolonner), time formats (both length and characters used), and more.
**This might be a lot of feedback**, but working through it helps you debug what might be wrong with your data before sending it in.
If your data contains floats, validation may hint that you should prepare the data with the `.round_data()` method. It uses the number of decimals defined in the UttrekksBeskrivelse to round upwards (if you have any "pure 0.5 values") and converts the columns to strings with a comma as the decimal sign along the way. It is used like this:
```python
data_dict_06339 = description_06339.round_data({"deltabellfilnavn.dat": df_06339})
```
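For intuition, here is a minimal standard-library sketch of that rounding convention (halves rounded upwards, comma as decimal sign). This is an illustration of the idea, not the package's actual implementation:

```python
from decimal import Decimal, ROUND_HALF_UP

def round_statbank_style(values, decimals):
    # Quantize half away from zero (so 0.5 -> 1), then swap '.' for ','
    quantum = Decimal(1).scaleb(-decimals)
    return [
        str(Decimal(str(v)).quantize(quantum, rounding=ROUND_HALF_UP)).replace(".", ",")
        for v in values
    ]

print(round_statbank_style([0.25, 1.15, 2.0], 1))  # ['0,3', '1,2', '2,0']
```

Note that going through `Decimal(str(v))` avoids the binary-float surprises you would get from `round()`, which rounds halves to even.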




## Getting apidata

These functions retrieve public-facing data from the statbank. They can be imported directly, in which case they do not ask for a username and password, and they are also available through the client (which asks for username and password during initialization).
```python
from statbank import apidata, apidata_all, apidata_rotate, apimetadata, apicodelist
```

```python
df_06339 = apidata_all("06339", include_id=True)
```
`apidata_all` does not need a specified query; it builds one itself, trying to get *all the data* from the table. This might be too much, resulting in an error.

The `include_id` parameter is a bit *magical*: it fetches both the code column and the value column for each categorical column and merges them next to each other. It also checks whether the two have identical content, and if so, includes it only once.
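A rough sketch of that deduplication idea (with made-up column names, not the package's code): keep the code and label columns side by side unless their contents are identical.

```python
import pandas as pd

def merge_code_and_label(codes, labels, name):
    # Place the code and label columns next to each other, but keep
    # only one copy when both columns have identical content
    df = pd.DataFrame({name: codes, f"{name}_label": labels})
    if df[name].equals(df[f"{name}_label"]):
        df = df.drop(columns=f"{name}_label")
    return df

print(list(merge_code_and_label(["0301", "1103"], ["Oslo", "Stavanger"], "Region").columns))
# ['Region', 'Region_label']
```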

If you want to specify a query to limit the response, use `apidata` instead.\
Here we request an "internal table", which only people at SSB have access to, with a specified URL and query.
```python
query = {'query': [{'code': 'Region', 'selection': {'filter': 'vs:Landet', 'values': ['0']}}, {'code': 'Alder', 'selection': {'filter': 'vs:AldGrupp19', 'values': ['000', '001', '002', '003', '004', '005', '006', '007', '008', '009', '010', '011', '012', '013', '014', '015', '016', '017', '018', '019', '020', '021', '022', '023', '024', '025', '026', '027', '028', '029', '030', '031', '032', '033', '034', '035', '036', '037', '038', '039', '040', '041', '042', '043', '044', '045', '046', '047', '048', '049', '050', '051', '052', '053', '054', '055', '056', '057', '058', '059', '060', '061', '062', '063', '064', '065', '066', '067', '068', '069', '070', '071', '072', '073', '074', '075', '076', '077', '078', '079', '080', '081', '082', '083', '084', '085', '086', '087', '088', '089', '090', '091', '092', '093', '094', '095', '096', '097', '098', '099', '100', '101', '102', '103', '104', '105', '106', '107', '108', '109', '110', '111', '112', '113', '114', '115', '116', '117', '118', '119+']}}, {'code': 'Statsbrgskap', 'selection': {'filter': 'vs:Statsborgerskap', 'values': ['000']}}, {'code': 'Tid', 'selection': {'filter': 'item', 'values': ['2022']}}], 'response': {'format': 'json-stat2'}}

df_folkemengde = apidata(
    "https://i.ssb.no/pxwebi/api/v0/no/prod_24v_intern/START/be/be01/folkemengde/Rd0002Aa",
    query,
    include_id=True,
)
```
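The long 'Alder' values list in the query above does not have to be typed out by hand; it can be generated. A small sketch that produces the same list:

```python
# Three-digit zero-padded ages "000"-"118", plus the open-ended "119+" group
alder_values = [f"{i:03d}" for i in range(119)] + ["119+"]

print(len(alder_values))                   # 120
print(alder_values[:3], alder_values[-1])  # ['000', '001', '002'] 119+
```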

`apimetadata` gets metadata from the *public* API, like `apidata` does.
```python
meta = apimetadata("05300")
```

`apicodelist` gets a specific codelist out of the metadata, or all the codelists.
```python
all_codelists = apicodelist("05300")
avstand_codelist = apicodelist("05300", "Avstand1")
```


`apidata_rotate` is a thin wrapper around `pivot_table`, adapted from https://github.com/sehyoun/SSB_API_helper/blob/master/src/ssb_api_helper.py
```python
df_folkemengde_rotert = apidata_rotate(df_folkemengde, "tidskolonne", "verdikolonne")
```
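To see what such a rotation does, here is the underlying `pivot_table` idea on a small hypothetical long-format frame (the column names are made up for illustration):

```python
import pandas as pd

# Hypothetical long-format data, similar in shape to what apidata returns
df = pd.DataFrame({
    "Tid": ["2021", "2021", "2022", "2022"],
    "Region": ["0301", "1103", "0301", "1103"],
    "value": [10, 20, 11, 21],
})

# Rotate so each year becomes its own column, one row per region
rotated = df.pivot_table(index="Region", columns="Tid", values="value")
print(rotated)
```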


## Using a date widget for the publishing day
To make it easier to set the date on the client after it has been initialized, you can use an ipywidgets date picker in JupyterLab.
```python
date = stat_client.date_picker()
date
# Run the cell above first and pick a date in the widget
# (don't re-run it, or you will have to pick the date again).
# Running this then updates the date on the client:
stat_client.set_publish_date(date)
```


## Saving and restoring Uttrekksbeskrivelser and Transfers as json

From `stat_client.transfer()` you receive a StatbankTransfer object, and from `stat_client.get_description()` a StatbankUttrekksBeskrivelse object. These can be serialized and saved to disk, and later restored; this can serve as a log of which transfers were done.
This can also be used to:
1. have one notebook get the descriptions of all tables produced by the pipeline (requires a password).
1. then have a notebook for each table, restoring the description from the local json, which can actually run .validate (without typing in a password).
1. then have a notebook at the end that sends all the tables (requiring password entry a second time).

```python
filbesk_06339 = stat_client.get_description("06339")
filbesk_06339.to_json("path.json")
# Later the file can be restored with
filbesk_06339_new = stat_client.read_description_json("path.json")
```
Some deeper data structures, like the DataFrames in the transfer, are not serialized and stored with the transfer object in its json. All request parts that might include auth are stripped.
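The stripping step can be pictured with this sketch (the key names are hypothetical; the package's own logic may differ):

```python
def strip_auth(obj, sensitive=("authorization", "auth", "password", "cookie")):
    # Recursively drop dict keys that could carry credentials
    # before dumping a request-like structure to json
    if isinstance(obj, dict):
        return {
            key: strip_auth(value, sensitive)
            for key, value in obj.items()
            if key.lower() not in sensitive
        }
    if isinstance(obj, list):
        return [strip_auth(item, sensitive) for item in obj]
    return obj

safe = strip_auth({"url": "https://example.com", "headers": {"Authorization": "Bearer ..."}})
print(safe)  # {'url': 'https://example.com', 'headers': {}}
```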

---


## The logger
The statbank package sets up its own logger with the standard `logging` package; it is available as `statbank.logger`.
Most validations are logged at level "info" if they look OK, and at level "warning" if they are not. The levels are colorized with colorama: green for INFO, magenta for WARNING.
If you *don't* want to see the info output from the validate method, raise the logger's level before calling validate, like this:
```python
import statbank
import logging
statbank.logger.setLevel(logging.WARNING)
```

## License

Distributed under the terms of the [MIT license][license],
_Dapla Statbank Client_ is free and open source software.

## Issues

If you encounter any problems,
please [file an issue] along with a detailed description.

## Credits

This project was generated from [Statistics Norway]'s [SSB PyPI Template].

[statistics norway]: https://www.ssb.no/en
[pypi]: https://pypi.org/
[ssb pypi template]: https://github.com/statisticsnorway/ssb-pypitemplate
[file an issue]: https://github.com/statisticsnorway/dapla-statbank-client/issues
[pip]: https://pip.pypa.io/

<!-- github-only -->

[license]: https://github.com/statisticsnorway/dapla-statbank-client/blob/main/LICENSE
[contributor guide]: https://github.com/statisticsnorway/dapla-statbank-client/blob/main/CONTRIBUTING.md
[reference guide]: https://statisticsnorway.github.io/dapla-statbank-client/reference.html


            
