| Field | Value |
| --- | --- |
| Name | hapiutils |
| Version | 0.1 |
| home_page | None |
| Summary | Additional hapiclient tools including merge, etc |
| upload_time | 2024-08-12 15:21:40 |
| maintainer | None |
| docs_url | None |
| author | None |
| requires_python | >=3.9 |
| license | BSD-3-Clause |
| keywords | hapi |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
# tools-python
Additional tools to support hapiclient, including merge and more. Currently consists of `merge_hapi()` and `hapi_to_df()`. Cloud HAPI support is coming soon. Assuming you have fetched two HAPI datasets and their metadata using hapiclient (`dataA, metaA` and `dataB, metaB`):

    merged_data, merged_meta = hapiutils.merge_hapi(dataA, metaA, dataB, metaB, round_to_sec=True)

The merged result is useful as input to hapi-nn (see https://github.com/hapi-server/application-neuralnetwork-python).
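To illustrate the idea behind a timestamp-aligned merge with second rounding, here is a conceptual sketch using pandas. The column names and the pandas-based approach are illustrative assumptions, not `merge_hapi`'s actual implementation:

```python
import pandas as pd

# Two hypothetical time series from different HAPI servers, with
# near-simultaneous but not identical timestamps.
a = pd.DataFrame({
    "Time": pd.to_datetime(["2013-01-01T00:00:54.100Z", "2013-01-01T01:00:54.200Z"]),
    "DST1800": [-7.0, -9.0],
})
b = pd.DataFrame({
    "Time": pd.to_datetime(["2013-01-01T00:00:54.300Z", "2013-01-01T01:00:54.400Z"]),
    "Field_Vector": [21000.0, 21005.0],
})

# Rounding to the nearest second (the round_to_sec=True analogue)
# lets near-simultaneous records line up on a shared time key.
for df in (a, b):
    df["Time"] = df["Time"].dt.round("1s")

merged = a.merge(b, on="Time", how="inner")
print(merged)
```

Rounding is what makes the inner join non-empty here: without it, the sub-second offsets would prevent any timestamps from matching.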
You can also cast HAPI data into a Pandas DataFrame with:

    hapidf = hapiutils.hapi_to_df(dataA)
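hapiclient returns data as a NumPy structured array whose `Time` field holds ISO-8601 byte strings. The following sketch shows one way such an array can be cast to a DataFrame; the field names are hypothetical and `hapi_to_df`'s actual behavior may differ:

```python
import numpy as np
import pandas as pd

# Hypothetical structured array mimicking hapiclient output:
# 'Time' is an ISO-8601 byte string, parameters are numeric fields.
data = np.array(
    [(b"2013-01-01T00:00:00Z", -7.0),
     (b"2013-01-01T01:00:00Z", -9.0)],
    dtype=[("Time", "S24"), ("DST1800", "f8")],
)

# A structured array maps directly onto DataFrame columns;
# decode the byte strings and parse them into real timestamps.
df = pd.DataFrame(data)
df["Time"] = pd.to_datetime(df["Time"].str.decode("utf-8"))
print(df)
```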
Full example:

    from hapiclient import hapi
    from hapiplot import hapiplot
    import hapiutils

    opts = {'logging': False, 'usecache': True, 'cachedir': './hapicache'}
    start = '2013-01-01T00:00:54Z'
    stop = '2013-01-01T06:00:54.000Z'
    serverA, datasetA, parametersA = 'https://cdaweb.gsfc.nasa.gov/hapi', 'OMNI2_H0_MRG1HR', 'DST1800'
    serverB, datasetB, parametersB = 'https://imag-data.bgs.ac.uk/GIN_V1/hapi', 'cki/best-avail/PT1M/hdzf', 'Field_Vector'

    dataA, metaA = hapi(serverA, datasetA, parametersA, start, stop, **opts)
    dataB, metaB = hapi(serverB, datasetB, parametersB, start, stop, **opts)
    newAB, metaAB = hapiutils.merge_hapi(dataA, metaA, dataB, metaB, True)
    hapiplot(newAB, metaAB)
Raw data
{
"_id": null,
"home_page": null,
"name": "hapiutils",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.9",
"maintainer_email": null,
"keywords": "HAPI",
"author": null,
"author_email": "Johns Hopkins University Applied Physics Laboratory LLC <sandy.antunes@jhuapl.edu>",
"download_url": "https://files.pythonhosted.org/packages/70/50/aa82ea551ebf1c8024d47efb4f9128b0cebcaa59b6603e22d02ae6c37e1d/hapiutils-0.1.tar.gz",
"platform": null,
"description": "# tools-python\nAdditional tools to support hapiclient, including merge, etc. Currently consists of 'merge_hapi()' and 'hapi_to_df()'. Cloud HAPI coming soon. Assuming you have fetched 2 HAPI datasets and their metadata using hapiclient ( dataA, metaA and dataB, metaB):\n\n merged_data, merged_meta = hapiutils.merge_hapi(dataA, metaA, dataB, metaB, round_to_sec=True)\n\nUseful to feed into hapi-nn (see https://github.com/hapi-server/application-neuralnetwork-python)\n\nAlso, you can cast HAPI data into a Pandas dataframe with:\n\n hapidf = hapi_to_df(dataA)\n\nFull example:\n\n from hapiclient import hapi\n from hapiplot import hapiplot\n import hapiutils\n\n opts = {'logging': False, 'usecache': True, 'cachedir': './hapicache' }\n start = '2013-01-01T00:00:54Z'\n stop = '2013-01-01T06:00:54.000Z'\n serverA, datasetA, parametersA = 'https://cdaweb.gsfc.nasa.gov/hapi', 'OMNI2_H0_MRG1HR', 'DST1800'\n serverB, datasetB, parametersB = \"https://imag-data.bgs.ac.uk/GIN_V1/hapi\", \"cki/best-avail/PT1M/hdzf\", \"Field_Vector\"\n\n dataA, metaA = hapi(serverA, datasetA, parametersA, start, stop, **opts)\n dataB, metaB = hapi(serverB, datasetB, parametersB, start, stop, **opts)\n newAB, metaAB = hapiutils.merge_hapi(dataA, metaA, dataB, metaB, True)\n hapiplot(newAB, metaAB)\n",
"bugtrack_url": null,
"license": "BSD-3-Clause",
"summary": "Additional hapiclient tools including merge, etc",
"version": "0.1",
"project_urls": {
"Documentation": "https://github.com/hapi-server/tools-python",
"Homepage": "https://hapi-server.org",
"Repository": "https://github.com/hapi-server/tools-python"
},
"split_keywords": [
"hapi"
],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "ec0dd70c3c5963dfd00c42bd51042c419b07ca2c20a5232a6aba59f9df61ba61",
"md5": "dc1e51bcaa3f8b0887fa167e0c6e31b3",
"sha256": "bd94bb642d6c111575352e6a591add9af7d0c628f8fafc822bc6681cbc82d651"
},
"downloads": -1,
"filename": "hapiutils-0.1-py3-none-any.whl",
"has_sig": false,
"md5_digest": "dc1e51bcaa3f8b0887fa167e0c6e31b3",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.9",
"size": 4342,
"upload_time": "2024-08-12T15:21:39",
"upload_time_iso_8601": "2024-08-12T15:21:39.374151Z",
"url": "https://files.pythonhosted.org/packages/ec/0d/d70c3c5963dfd00c42bd51042c419b07ca2c20a5232a6aba59f9df61ba61/hapiutils-0.1-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "7050aa82ea551ebf1c8024d47efb4f9128b0cebcaa59b6603e22d02ae6c37e1d",
"md5": "c403b8084c0e20864d54c2779dc2a52e",
"sha256": "d3c9504b0991ffc0b99acde1f4be5d6d912eb4273ef2ac021d84340225880cfa"
},
"downloads": -1,
"filename": "hapiutils-0.1.tar.gz",
"has_sig": false,
"md5_digest": "c403b8084c0e20864d54c2779dc2a52e",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.9",
"size": 4940,
"upload_time": "2024-08-12T15:21:40",
"upload_time_iso_8601": "2024-08-12T15:21:40.856186Z",
"url": "https://files.pythonhosted.org/packages/70/50/aa82ea551ebf1c8024d47efb4f9128b0cebcaa59b6603e22d02ae6c37e1d/hapiutils-0.1.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2024-08-12 15:21:40",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "hapi-server",
"github_project": "tools-python",
"travis_ci": false,
"coveralls": false,
"github_actions": false,
"lcname": "hapiutils"
}