| Field | Value |
| --- | --- |
| Name | ecopipeline |
| Version | 0.8.12 |
| home_page | None |
| Summary | Contains functions for use in Ecotope Datapipelines |
| upload_time | 2025-07-29 23:04:04 |
| maintainer | None |
| docs_url | None |
| author | None |
| requires_python | >=3.11 |
| license | None |
| keywords | |
| VCS | |
| bugtrack_url | |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
# DataPipelinePackage
## To Install the Package
To install from PyPI for use elsewhere:

    $ pip install ecopipeline

To install locally in editable mode, navigate to the DataPipelinePackage directory and run:

    $ pip install -e .
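To sanity-check the install, the short snippet below can be used; it relies only on the Python standard library and does not assume anything about the package's own API:

```python
# Quick post-install check: import the package and report the installed
# version using only the standard library.
import importlib.metadata

import ecopipeline  # raises ImportError if the install did not succeed

print(importlib.metadata.version("ecopipeline"))  # e.g. "0.8.12"
```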
## Using the Package
See https://ecotoperesearch.github.io/DataPipelinePackage/build/html/index.html for documentation.
### config.ini
The pipeline is configured through a `config.ini` file with the following sections and keys (a sample layout is sketched after this list):
- database
    - user: username for the host database connection
    - password: password for the host database connection
    - host: name of the host
    - database: name of the database
- minute
    - table_name: name of the table to be created in the MySQL database containing minute-by-minute data
- hour
    - table_name: name of the table to be created in the MySQL database containing hour-by-hour data
- day
    - table_name: name of the table to be created in the MySQL database containing day-by-day data
- input
    - directory: directory of the folder containing the input files listed below
    - site_info: name of the site information CSV
    - 410a_info: name of the 410a information CSV
    - superheat_info: name of the superheat information CSV
- output
    - directory: directory of the folder where any pipeline output should be written
- data
    - directory: directory of the folder from which extract loads the raw sensor data
    - fieldManager_api_usr: username for the Field Manager API if extracting data through that medium
    - fieldManager_api_pw: password for the Field Manager API if extracting data through that medium
    - fieldManager_device_id: device ID for the Field Manager API if extracting data through that medium
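For orientation, here is a minimal sketch of what such a `config.ini` could look like. The section and key names mirror the list above; every value shown (credentials, paths, table names) is a placeholder, not taken from a real deployment:

```ini
[database]
user = db_username
password = db_password
host = localhost
database = ecotope_db

[minute]
table_name = site_minute

[hour]
table_name = site_hour

[day]
table_name = site_day

[input]
directory = ./input/
site_info = site_info.csv
410a_info = 410a_info.csv
superheat_info = superheat_info.csv

[output]
directory = ./output/

[data]
directory = ./data/
fieldManager_api_usr = api_username
fieldManager_api_pw = api_password
fieldManager_device_id = device_id
```

Because the file carries database and Field Manager credentials, it is usually kept out of version control.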
## Unit Testing
To run the unit tests, run the following command in a terminal from the DataPipelinePackage directory:
```bash
python -m pytest
```
Raw data
{
"_id": null,
"home_page": null,
"name": "ecopipeline",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.11",
"maintainer_email": null,
"keywords": null,
"author": null,
"author_email": null,
"download_url": "https://files.pythonhosted.org/packages/14/be/e99c0912774ab60a2b2b8284887fd6bedc14dc10d91a21c1add6c21a83eb/ecopipeline-0.8.12.tar.gz",
"platform": null,
"description": "# DataPipelinePackage\n\n## To Install the Package\n From the internet for use elsewhere:\n $ pip install ecopipeline\n Install locally in an editable mode:\n Navigate to DataPipelinePackage directory and run the following command\n $ pip install -e .\n\n## Using the Package\nSee https://ecotoperesearch.github.io/DataPipelinePackage/build/html/index.html for documentation\n\n### config.ini\n- database\n - user: username for host database connection \n - password: password for host database connection\n - host: name of host \n - database: name of database\n- minute\n - table_name: name of table to be created in the mySQL database containing minute-by-minute data\n- hour\n - table_name: name of table to be created in the mySQL database containing hour-by-hour data\n- day\n - table_name: name of table to be created in the mySQL database containing day-by-day data\n- input\n - directory: diretory of the folder containing the input files listed below\n - site_info: name of the site information csv\n - 410a_info: name of the 410a information csv\n - superheat_info: name of the superheat infomation csv\n- output \n - directory: diretory of the folder where any pipeline output should be written to\n- data\n - directory: diretory of the folder from which extract loads the raw sensor data\n - fieldManager_api_usr: Username for Field Manager API if extracting data through that medium\n - fieldManager_api_pw: Password for Field Manager API if extracting data through that medium\n - fieldManager_device_id: Device ID for Field Manager API if extracting data through that medium\n## Unit Testing\nTo run Unit tests, run the following command in the terminal in the corresponding directory:\n```bash\npython -m pytest\n```\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n",
"bugtrack_url": null,
"license": null,
"summary": "Contains functions for use in Ecotope Datapipelines",
"version": "0.8.12",
"project_urls": null,
"split_keywords": [],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "8efa86433869c0e18c79e277eef361a75c847d95552d347d5b0453b403ff0f0a",
"md5": "e18cb75a194c4641f813d6931bae4aae",
"sha256": "90ebd89a7ae484aa34290a95b91c1130a8313e23a8cc2b2bbaeb26487f76d522"
},
"downloads": -1,
"filename": "ecopipeline-0.8.12-py3-none-any.whl",
"has_sig": false,
"md5_digest": "e18cb75a194c4641f813d6931bae4aae",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.11",
"size": 54490,
"upload_time": "2025-07-29T23:04:03",
"upload_time_iso_8601": "2025-07-29T23:04:03.332807Z",
"url": "https://files.pythonhosted.org/packages/8e/fa/86433869c0e18c79e277eef361a75c847d95552d347d5b0453b403ff0f0a/ecopipeline-0.8.12-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "14bee99c0912774ab60a2b2b8284887fd6bedc14dc10d91a21c1add6c21a83eb",
"md5": "a2b480d895cfba18579fc050fb157f7b",
"sha256": "153bed758b31669dc0324d57957e05bebee6c1c550fbc6273561bc8879f49b8b"
},
"downloads": -1,
"filename": "ecopipeline-0.8.12.tar.gz",
"has_sig": false,
"md5_digest": "a2b480d895cfba18579fc050fb157f7b",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.11",
"size": 50893,
"upload_time": "2025-07-29T23:04:04",
"upload_time_iso_8601": "2025-07-29T23:04:04.624040Z",
"url": "https://files.pythonhosted.org/packages/14/be/e99c0912774ab60a2b2b8284887fd6bedc14dc10d91a21c1add6c21a83eb/ecopipeline-0.8.12.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-07-29 23:04:04",
"github": false,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"lcname": "ecopipeline"
}