# ParkAPI Sources
ParkAPI Sources is a collection of converters from several different data sources to normalized ParkAPI data. ParkAPI
supports parking for cars, bikes, and lockers. The data model is based on the original ParkAPI project and tries
to stay compatible with DATEX II Parking Publication Light whenever the model is extended.
We support the following data sources:
| name | data type | purpose | type | uid | realtime |
|----------------------------------------------------------------|-------------|------------|--------------|-----------------------------|----------|
| Aachen | ParkingSite | car | pull | `aachen` | yes |
| APCOA Services | ParkingSite | car | pull | `apcoa` | no |
| Deutsche Bahn | ParkingSite | car & bike | pull | `bahn_v2` | no |
| B+B Parkhaus GmbH & Co. KG | ParkingSite | car | push (xlsx) | `bb_parkhaus` | no |
| Stadt Bietigheim-Bissingen | ParkingSite | car | pull | `bietigheim_bissingen` | yes |
| Barrierefreie Reisekette Baden-Württemberg: PKW-Parkplätze | ParkingSite | car | pull | `bfrk_bw_car` | no |
| Barrierefreie Reisekette Baden-Württemberg: Fahrrad-Parkplätze | ParkingSite | bike | pull | `bfrk_bw_bike` | no |
| Stadt Buchen | ParkingSite | car | push (json) | `buchen` | yes |
| Stadt Ellwangen | ParkingSite | car | push (xlsx) | `ellwangen` | no |
| Esslingen | ParkingSite | car | push (json) | `esslingen` | no |
| Stadt Freiburg | ParkingSite | car | pull | `freiburg` | yes |
| Stadt Freiburg: Statische Behindertenparkplätze | ParkingSpot | car | pull | `freiburg_disabled_static` | no |
| Stadt Freiburg: Behindertenparkplätze mit Sensoren | ParkingSpot | car | pull | `freiburg_disabled_sensors` | yes |
| Stadt Freiburg: Park & Ride Statische Parkplätze | ParkingSite | car | pull | `freiburg_p_r_static` | no |
| Stadt Freiburg: Park & Ride Parkplätze mit Sensoren | ParkingSite | car | pull | `freiburg_p_r_sensors` | yes |
| Stadt Freiburg: Scanner | ParkingSite | car | pull | `freiburg_scanner` | no |
| Friedrichshafen Sensors                                        | ParkingSpot | car        | pull         | `friedrichshafen_sensors`   | yes      |
| GOLDBECK Parking Services | ParkingSite | car | push (xlsx) | `goldbeck` | no |
| Stadt Heidelberg | ParkingSite | car | pull | `heidelberg` | yes |
| Stadt Heidelberg EasyPark | ParkingSite | car | pull | `heidelberg_easypark` | no |
| Stadt Herrenberg | ParkingSite | car | pull | `herrenberg` | no |
| Stadt Herrenberg - Munigrid | ParkingSite | bike | pull | `herrenberg_bike` | no |
| PARK SERVICE HÜFNER GmbH + Co. KG | ParkingSite | car | push (xlsx) | `huefner` | no |
| Stadt Karlsruhe: PKW-Parkplätze | ParkingSite | car | pull | `karlsruhe` | yes |
| Stadt Karlsruhe: Fahrrad-Abstellanlagen                        | ParkingSite | bike       | pull         | `karlsruhe_bike`            | no       |
| Keltern | ParkingSite | car | push (xlsx) | `keltern` | no |
| Kienzler: Bike and Ride | ParkingSite | bike | pull | `kienzler_bike_and_ride` | yes |
| Kienzler: Karlsruhe | ParkingSite | bike | pull | `kienzler_karlruhe` | yes |
| Kienzler: Neckarsulm | ParkingSite | bike | pull | `kienzler_neckarsulm` | yes |
| Kienzler: Offenburg | ParkingSite | bike | pull | `kienzler_offenburg` | yes |
| Kienzler: RadSafe | ParkingSite | bike | pull | `kienzler_rad_safe` | yes |
| Kienzler: Stuttgart | ParkingSite | bike | pull | `kienzler_stuttgart` | yes |
| Kienzler: VRN | ParkingSite | bike | pull | `kienzler_vrn` | yes |
| Konstanz | ParkingSite | car | pull | `konstanz` | yes |
| Stadt Konstanz: Fahrrad-Abstellanlagen | ParkingSite | bike | push | `konstanz_bike` | no |
| Stadt Mannheim | ParkingSite | car | push (json) | `mannheim` | yes |
| Stadt Neckarsulm: PKW-Parkplätze | ParkingSite | car | pull | `neckarsulm` | no |
| Stadt Neckarsulm: Fahrrad-Abstellanlagen | ParkingSite | bike | pull | `neckarsulm_bike` | no |
| Open-Data-Plattform öV Schweiz | ParkingSite | car | pull (json) | `opendata_swiss` | no |
| P + M Baden-Württemberg | ParkingSite | car | pull | `p_m_bw` | yes |
| Baden-Württemberg: Parken und Mitfahren | ParkingSite | car | push (xlsx) | `pum_bw` | no |
| RadVIS Baden-Württemberg (experimental) | ParkingSite | bike | pull | `radvis_bw` | no |
| Parkraumgesellschaft Baden-Württemberg | ParkingSite | car | pull | `pbw` | yes |
| Stadt Pforzheim | ParkingSite | car | push (csv) | `pforzheim` | no |
| Stadt Reutlingen: PKW-Parkplätze | ParkingSite | car | push (csv) | `reutlingen` | no |
| Stadt Reutlingen: Fahrrad-Abstellanlagen | ParkingSite | bike | push (csv) | `reutlingen_bike` | no |
| Stadt Reutlingen: Behindertenparkplätze | ParkingSpot | car | push (csv) | `reutlingen_disabled` | no |
| Stadt Stuttgart | ParkingSite | car | push (json) | `stuttgart` | yes |
| Stadt Ulm | ParkingSite | car | pull | `ulm` | yes |
| Stadt Ulm: E-Quartiershubs Sensors | ParkingSite | car | pull | `ulm_sensors` | yes |
| Stadt Ulm: E-Quartiershubs Sensors | ParkingSpot | car | pull | `ulm_sensors` | yes |
| Velobrix | ParkingSite | bike | pull | `velobrix` | yes |
| Verkehrsverbund Rhein-Neckar GmbH: P+R Parkplätze | ParkingSite | car | pull | `vrn_p_r` | yes |
| Verband Region Stuttgart: Bondorf | ParkingSite | car | pull | `vrs_bondorf` | yes |
| Verband Region Stuttgart: Kirchheim | ParkingSite | car | pull | `vrs_kirchheim` | yes |
| Verband Region Stuttgart: Neustadt | ParkingSite | car | pull | `vrs_neustadt` | yes |
| Verband Region Stuttgart: Park and Ride | ParkingSite | car | push (xlsx) | `vrs_p_r` | no |
| Verband Region Stuttgart: Vaihingen | ParkingSite | car | pull | `vrs_vaihingen` | yes |
New converters for new sources are always welcome; please have a look at "Contribute" below.
## Install
ParkAPI Sources is a Python module published on [PyPI](https://pypi.org/project/parkapi-sources/). You can install it with
```shell
pip install parkapi-sources
```
If you use parkapi-sources in a project, we recommend pinning the version. As long as parkapi-sources is in beta, breaking
changes might be introduced at the minor version level (for example, a change from 0.1.1 to 0.2.0). As soon as 1.0 is released, we
will follow [Semantic Versioning](https://semver.org), which means that breaking changes will only appear on major version changes
(for example, a change from 1.1.2 to 2.0.0). You can expect frequent minor version bumps, as any new converter is
a new feature.
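As an illustration, a compatible-release pin in your `requirements.txt` (the version shown is just an example) limits updates to the current minor release:
```
parkapi-sources~=0.23.0
```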
## Usage
Your starting point is always the `ParkAPISources` class, where all sources are registered.
```python
from parkapi_sources import ParkAPISources
my_sources = ParkAPISources()
```
`ParkAPISources` accepts the following parameters (see the sketch below the list):
- `config: Optional[dict] = None` is a dictionary for config values, especially secrets.
- `converter_uids: Optional[list[str]] = None` loads just the converters with the given uids
- `no_pull_converter: bool = False` excludes pull converters
- `no_push_converter: bool = False` excludes push converters
- `custom_converters: list[BaseConverter] = None` registers additional custom converters
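A minimal sketch combining these parameters (the secret key is a placeholder; the uids are taken from the table above):
```python
from parkapi_sources import ParkAPISources

my_sources = ParkAPISources(
    config={'MY_SECRET': 'change-me'},         # converter-specific secrets, see below
    converter_uids=['freiburg', 'karlsruhe'],  # only load these two converters
    no_push_converter=True,                    # skip push converters entirely
)
```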
### Configuration
Config values are mostly specific to individual converters: if a converter requires config values, they are defined
right at the top of the converter class:
```
required_config_keys = ['MY_SECRET']
```
`ParkAPISources` offers a method to check if all config values are set:
```python
from parkapi_sources import ParkAPISources
my_sources = ParkAPISources()
my_sources.check_credentials()
```
If not all config values are set, a `MissingConfigException` is raised. It's recommended to run this check right after
initializing the module to prevent exceptions at runtime.
Besides converter-specific config values, there are two global values which can be used to configure the source of
GeoJSON files. By default, static GeoJSON files are fetched from this repository. This behaviour can be changed (see the example below):
- `STATIC_GEOJSON_BASE_URL` defines another base URL for GeoJSON files
- `STATIC_GEOJSON_BASE_PATH` defines a local path instead, so the application loads files locally without network requests
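For example, both global values go through the same `config` dictionary as the converter-specific ones (the path shown is a placeholder):
```python
from parkapi_sources import ParkAPISources

# Load static GeoJSON files from a local checkout instead of over the network
my_sources = ParkAPISources(
    config={'STATIC_GEOJSON_BASE_PATH': '/path/to/local/geojson'},
)
```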
### Use converters
After initializing, you will find all initialized converters at `ParkAPISources.converter_by_uid`. As the input is very
different, so are the methods you have to use. In general, you can differ between two major strategies,
pull- and push-converters. Also, each converter has to define which data it provides by using the corresponding parent
classes, currently `ParkingSite` for parking sites and `ParkingSpot` for parking spots
### Pull converters
Pull converters are responsible for getting data from an external data source. This can be an REST endpoints as well as
HTML which is scraped. Pull converters always split up in static and realtime data, because at most sources, this is
not the same. Each pull converter has at least a method for static parking sites, or two if it supports realtime data.
For `ParkingSite`s, it's
1) `get_static_parking_sites(self) -> tuple[list[StaticParkingSiteInput], list[ImportParkingSiteException]]:`
2) `get_realtime_parking_sites(self) -> tuple[list[RealtimeParkingSiteInput], list[ImportParkingSiteException]]:`
For `ParkingSpot`s, these are
1) `get_static_parking_spots(self) -> tuple[list[StaticParkingSpotInput], list[ImportParkingSpotException]]:`
2) `get_realtime_parking_spots(self) -> tuple[list[RealtimeParkingSpotInput], list[ImportParkingSpotException]]:`
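Putting this together, a pull converter can be fetched from `converter_by_uid` and queried like this (a sketch; `freiburg` is a realtime-enabled pull converter according to the table above):
```python
from parkapi_sources import ParkAPISources

my_sources = ParkAPISources(converter_uids=['freiburg'])
my_sources.check_credentials()
converter = my_sources.converter_by_uid['freiburg']

# Static data first, realtime data afterwards; datasets which fail validation
# are returned as exceptions instead of aborting the whole import.
static_inputs, static_errors = converter.get_static_parking_sites()
realtime_inputs, realtime_errors = converter.get_realtime_parking_sites()
print(f'{len(static_inputs)} static parking sites, {len(static_errors)} errors')
```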
### Push converters
Push converters are responsible for handling data which is pushed to the service via defined endpoints. Usually, these
converters are used as handlers behind HTTP endpoints, but of course you can use them in other ways, too, for example in
command line scripts.
Push converters always handle specific formats, therefore there are multiple types of push converters; a usage sketch
follows the list below. All push converters return a
`tuple[list[StaticParkingSiteInput | RealtimeParkingSiteInput], list[ImportParkingSiteException]]`, so they decide based
on the given data whether it is static or realtime data - or even both, in which case each dataset ends up as two
entries in the first list.
1) A `CsvConverter` handles CSV files: `handle_csv_string(self, data: StringIO)`
2) A `JsonConverter` handles JSON based data: `handle_json(self, data: dict | list)`
3) An `XlsxConverter` handles XLSX data: `def handle_xlsx(self, workbook: Workbook)`, parsed by `openpyxl`
4) An `XmlConverter` handles XML data: `def handle_xml(self, root: Element)`, parsed by `lxml`
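For example, a CSV push converter could be driven from a small command line script like this (a sketch; the file name is a placeholder, and `pforzheim` is a CSV push converter according to the table above):
```python
from io import StringIO
from pathlib import Path

from parkapi_sources import ParkAPISources

my_sources = ParkAPISources(converter_uids=['pforzheim'])
converter = my_sources.converter_by_uid['pforzheim']

# The converter decides per dataset whether it is static or realtime data.
csv_data = StringIO(Path('pforzheim-export.csv').read_text())
parking_site_inputs, import_errors = converter.handle_csv_string(csv_data)
```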
### Results
At `webapp/models/parking_site_inputs.py`, you can find the definition of `StaticParkingSiteInput` and
`RealtimeParkingSiteInput`. These `dataclasses` are also [`validataclasses`](https://pypi.org/project/validataclass/), so you can be sure that the data
you get is validated.
### Patch data with local files
If `PARK_API_PARKING_SITE_PATCH_DIR` is set, all pull converters will check if there is a JSON file in this directory
called `source_uid.json` (replace `source_uid` with the source you want to patch). It expects a ParkAPI JSON format
with `uid` as the only required field. A file might look like this:
```
{
  "items": [
    {
      "uid": "my-uid",
      "name": "New name"
    }
  ]
}
```
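The patch directory itself is configured like the other values (a sketch, assuming `PARK_API_PARKING_SITE_PATCH_DIR` is passed via the `config` dictionary; the path is a placeholder):
```python
from parkapi_sources import ParkAPISources

# Assumption: the patch directory is set like any other config value.
my_sources = ParkAPISources(config={'PARK_API_PARKING_SITE_PATCH_DIR': '/path/to/patches'})
```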
### Debugging
In order to debug ParkAPI Sources, there are two config values which can be used to dump all requests. Before doing
this, please keep in mind that this might result in a lot of data on your disk, especially for realtime-enabled
sources. Please also keep in mind that the dumps will contain credentials for sources which use them, so please
handle this data the same way as you handle passwords.
Two config values are required to enable debugging (see the sketch after the list):
- `DEBUG_SOURCES` should be a list of source uids which should be debugged
- `DEBUG_DUMP_DIR` is the directory where the requests get dumped to
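A sketch of enabling this, assuming both values are passed via the `config` dictionary like the other config values (the dump directory is a placeholder):
```python
from parkapi_sources import ParkAPISources

my_sources = ParkAPISources(
    config={
        'DEBUG_SOURCES': ['freiburg', 'karlsruhe'],  # only dump requests of these sources
        'DEBUG_DUMP_DIR': '/tmp/parkapi-dumps',      # placeholder directory
    },
)
```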
Setting these two values will make ParkAPI Sources save all requests to
`{DEBUG_DUMP_DIR}/{source_uid}/{datetime}-{type}`, with `type` being one of:
- `metadata`: metadata like the URL, HTTP method, HTTP status, request and response headers, and the request body
- `response-body`: the response body
With this data, you can start a deeper analysis of why things don't work as expected.
## Contribute
### Report bugs
As ParkAPI Sources integrates a lot of external data sources, sometimes without a proper specification, converters might
run into issues because of changes on the data source side. If you see that happening, please file a bug report in the
[issues](https://github.com/ParkenDD/parkapi-sources-v3/issues). If possible, please attach the data which fails.
### Contribute new source data
If you see a nice data source which is not covered by ParkAPI Sources, you can always create a feature request in our
[issues](https://github.com/ParkenDD/parkapi-sources-v3/issues). If you do so, please add the data you found, so we can actually build
the converter. If possible, please try to find out the licence, too.
### Write new converters
We always welcome merge requests with new converters. A merge request should contain the following:
* MIT licenced code
* A converter which validates the input in a way that the output follows the data model
* A test with example data to ensure that the converter works with current data
## Write a new converter
First you have to determine which type of converter you need. If you fetch the data yourself from an endpoint or
website, you need a `PullConverter`; if the data arrives as a file pushed via HTTP or CLI, you need a `PushConverter`.
### Write the converter
In order to write a converter, you need a directory at `converters`. Please name your directory in a way that points
to the actual converter you will write. If it's just one converter, the `uid` is usually the best choice.
At `converters/your-converter`, you will need at least a `converter.py` and an `__init__.py`. In most cases, you will
also need some `validataclasses`, which you can put in `models.py`. Validation is crucial in this library, because
users of this library should never have to deal with invalid data. Additionally, if you have very specific new data
types, you can write new `validataclass` validators, which usually go in `validators.py`.
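A minimal sketch of such a model; the class and field names are hypothetical, and the validators shown are standard `validataclass` validators:
```python
from validataclass.dataclasses import validataclass
from validataclass.validators import IntegerValidator, StringValidator


@validataclass
class MyParkingSiteRowInput:
    # Hypothetical fields: use whatever the data source actually delivers.
    name: str = StringValidator(max_length=256)
    capacity: int = IntegerValidator(min_value=0)
```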
### Testing the converter
In order to prove that the converter works, we need to test its basic functionality. To do this, we need a
snapshot of the data which should be integrated. The common place for this data is
`tests/converters/data/filename-starting-with-uid.ending`. This data should be used in one or multiple tests (in
several cases two tests, one for static and one for realtime data) stored at `tests/converters/uid_test.py`.
If you test a `PullConverter`, you will need to mock requests. This can be done using the fantastic
[`requests_mock`](https://pypi.org/project/requests-mock/) library.
If you created new validators, these should be tested with different inputs. Usually, `pytest.mark.parametrize` is a
nice approach for this.
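A sketch of such a test, with a hypothetical source uid, endpoint and snapshot file (the `requests_mock` fixture is provided automatically once the library of the same name is installed):
```python
from pathlib import Path

from parkapi_sources import ParkAPISources


def test_get_static_parking_sites(requests_mock):
    # Hypothetical endpoint and snapshot file: replace them with the real ones.
    snapshot = Path(__file__).parent / 'data' / 'my_source.json'
    requests_mock.get('https://example.com/parking.json', text=snapshot.read_text())

    converter = ParkAPISources(converter_uids=['my_source']).converter_by_uid['my_source']
    static_inputs, import_errors = converter.get_static_parking_sites()

    assert len(static_inputs) > 0
    assert import_errors == []
```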
### Migrate a converter
If you want to migrate a v1 or v2 converter, you can re-use some of the code. There is a paradigm change, though:
`parkapi-sources-v3` enforces strict validation after transforming the data, while v1 and v2 converters don't.
ParkAPI v1 / v2 converters are always pull converters, so the base class is always `PullConverter`.
Instead of defining `POOL`, you set `source_info` in the same place. The attributes are almost the same, except that
`id` was renamed to `uid`, and there is the new attribute `has_realtime_data`, which has to be set.
ParkAPI v1 and v2 used two methods for static and realtime data, just like `parkapi-sources-v3`:
- the old static data handler `def get_lot_infos(self) -> List[LotInfo]:` becomes
  `get_static_parking_sites(self) -> tuple[list[StaticParkingSiteInput], list[ImportParkingSiteException]]:` in
  `parkapi-sources-v3`.
- the old realtime data handler `def get_lot_data(self) -> List[LotData]:` becomes
  `def get_realtime_parking_sites(self) -> tuple[list[RealtimeParkingSiteInput], list[ImportParkingSiteException]]:` in
  `parkapi-sources-v3`.
The result objects follow quite the same idea, too:
- `LotInfo` becomes `StaticParkingSiteInput`
- `LotData` becomes `RealtimeParkingSiteInput`
There's also a helper for scraped content: before, there was `self.request_soup(self.POOL.public_url)` in order to get
a `BeautifulSoup` element. Now, there is a helper mixin called `PullScraperMixin`. You can use it this way:
```
class MyPullConverter(PullConverter, PullScraperMixin):
```
Additionally, there is another mixin for the GeoJSON files you already know from v1 and v2 converters:
`StaticGeojsonDataMixin`. Using this, you can just define the static data method this way:
```
    def get_static_parking_sites(self) -> tuple[list[StaticParkingSiteInput], list[ImportParkingSiteException]]:
        return self._get_static_parking_site_inputs_and_exceptions(source_uid=self.source_info.uid)
```
The default location for GeoJSON files is a [separate repository](https://github.com/ParkenDD/parkapi-static-data).
Please keep in mind that you will have to add tests for the migrated scraper.
### Linting
As we try to keep a consistent code style, please lint your code before creating the merge request. We use `ruff` for
linting and formatting. There is a Makefile target that does both: `make lint`. It runs the following commands:
```bash
ruff format ./src ./tests
ruff check --fix ./src ./tests
```
If you don't have `ruff` installed globally, you can create a virtual environment for these tools:
```bash
virtualenv venv
source venv/bin/activate
pip install -r requirements.txt -r requirements-dev.txt
ruff format ./src ./tests
ruff check --fix ./src ./tests
```
### Make your new converter available
All available converters should be registered in the `ParkAPISources` class in order to make them accessible to users
of this library, so please register your converter there. The new converter should also be added to the table in this
README.md file.
### Release process
Once you create a merge request, the maintainers will review your code. If everything is fine, it will be merged into
`main`, and a new release will be created soon after. As written above, we follow SemVer, so any new converter bumps
the minor version. In order to use the new release, please remember to update your
`requirements.txt` / the dependency manager you use.
## Licence
This library is published under the MIT licence. Please look at `LICENCE.txt` for details.