| Name | groundhog-hpc |
| Version | 0.3.1 |
| download | |
| home_page | None |
| Summary | Iterative HPC function development. As many 'first tries' as you need. |
| upload_time | 2025-10-27 14:29:48 |
| maintainer | None |
| docs_url | None |
| author | None |
| requires_python | >=3.10 |
| license | MIT |
| keywords | faas, function serving, globus compute, hpc |
| VCS | |
| bugtrack_url | |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
Home of `hog` ☀️🦫🕳️
## Quickstart
Groundhog makes it easy to run, tweak, and re-run Python functions on HPC clusters via [Globus Compute](https://www.globus.org/compute) using simple decorators.
Groundhog automatically manages remote environments (powered by [uv](https://docs.astral.sh/uv/))—just update Python versions or dependencies in your script, no SSH needed.
**Key concepts:**
- `@hog.function()` - Configures a function to run on a Globus Compute endpoint. Decorator kwargs (like `endpoint`, `account`) become the default `user_endpoint_config`.
- `@hog.harness()` - Marks a local entry point that orchestrates remote calls via `.remote()` or `.submit()` (both illustrated below).
- The desired remote Python environment (version and dependencies) is specified alongside your code via [PEP 723](https://peps.python.org/pep-0723/) metadata.
```python
# /// script
# requires-python = ">=3.10"
# dependencies = ["numpy"]
# ///

import groundhog_hpc as hog

@hog.function(endpoint="your-endpoint-id", account="your-account")
def compute(x: int) -> int:
    import numpy as np
    return int(np.sum(range(x)))

@hog.harness()
def main():
    result = compute.remote(100)
    print(result)
```
Run with: `hog run myscript.py main`
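
The quickstart above uses the blocking `.remote()` call. For the non-blocking `.submit()` path mentioned in the key concepts, here is a minimal sketch; it assumes `.submit()` returns a future-like object with a standard `.result()` method, and the harness name `fan_out` is illustrative:

```python
# /// script
# requires-python = ">=3.10"
# dependencies = ["numpy"]
# ///

import groundhog_hpc as hog

@hog.function(endpoint="your-endpoint-id", account="your-account")
def compute(x: int) -> int:
    import numpy as np
    return int(np.sum(range(x)))

@hog.harness()
def fan_out():
    # Submit several remote calls without blocking on each one,
    # then collect results in submission order.
    # Assumption: .submit() returns a future-like object exposing .result().
    futures = [compute.submit(n) for n in (10, 100, 1000)]
    for fut in futures:
        print(fut.result())
```

If the harness-to-CLI mapping follows the quickstart, this would be run as `hog run myscript.py fan_out`.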
---
See also: [examples/README.md](./examples/README.md)
## Raw data
{
"_id": null,
"home_page": null,
"name": "groundhog-hpc",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.10",
"maintainer_email": null,
"keywords": "FaaS, Function Serving, Globus Compute, HPC",
"author": null,
"author_email": "Owen Price Skelly <OwenPriceSkelly@uchicago.edu>",
"download_url": "https://files.pythonhosted.org/packages/db/86/e81cb4ef8eeddf25b60eb321a1d85214b24b0be48cd2f73a014359ff4264/groundhog_hpc-0.3.1.tar.gz",
"platform": null,
"description": "Home of `hog` \u2600\ufe0f\ud83e\uddab\ud83d\udd73\ufe0f\n\n## Quickstart\n\nGroundhog makes it easy to run, tweak, and re-run python functions on HPC clusters via [Globus Compute](https://www.globus.org/compute) using simple decorators.\n\nGroundhog automatically manages remote environments (powered by [uv](https://docs.astral.sh/uv/))\u2014just update Python versions or dependencies in your script, no SSH needed.\n\n**Key concepts:**\n- `@hog.function()` - Configures a function to run on a Globus Compute endpoint. Decorator kwargs (like `endpoint`, `account`) become the default `user_endpoint_config`.\n- `@hog.harness()` - Marks a local entry point that orchestrates remote calls via `.remote()` or `.submit()`.\n- The desired remote Python environment (version and dependencies) is specified alongside your code via [PEP 723](https://peps.python.org/pep-0723/) metadata.\n\n```python\n# /// script\n# requires-python = \">=3.10\"\n# dependencies = [\"numpy\"]\n# ///\n\nimport groundhog_hpc as hog\n\n@hog.function(endpoint=\"your-endpoint-id\", account=\"your-account\")\ndef compute(x: int) -> int:\n import numpy as np\n return int(np.sum(range(x)))\n\n@hog.harness()\ndef main():\n result = compute.remote(100)\n print(result)\n```\n\nRun with: `hog run myscript.py main`\n\n---\n\nsee also: [examples/README.md](./examples/README.md)\n",
"bugtrack_url": null,
"license": "MIT",
"summary": "Iterative HPC function development. As many 'first tries' as you need.",
"version": "0.3.1",
"project_urls": null,
"split_keywords": [
"faas",
" function serving",
" globus compute",
" hpc"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "59fee48c375fea7d9e39bbc6ec3d718672e5b84c8067b10e725fdaf3e131a6a1",
"md5": "ffe5ba93490b4ca9a4b355acc1e752ed",
"sha256": "468d7fd345c84748870c3c56588a5dab5c8198af3b1ddc7a3f992c73db8a6c08"
},
"downloads": -1,
"filename": "groundhog_hpc-0.3.1-py3-none-any.whl",
"has_sig": false,
"md5_digest": "ffe5ba93490b4ca9a4b355acc1e752ed",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.10",
"size": 30623,
"upload_time": "2025-10-27T14:29:47",
"upload_time_iso_8601": "2025-10-27T14:29:47.263877Z",
"url": "https://files.pythonhosted.org/packages/59/fe/e48c375fea7d9e39bbc6ec3d718672e5b84c8067b10e725fdaf3e131a6a1/groundhog_hpc-0.3.1-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "db86e81cb4ef8eeddf25b60eb321a1d85214b24b0be48cd2f73a014359ff4264",
"md5": "63c90b752d1ac0e39017e2b60839311b",
"sha256": "b363f5f9330a22337358455497d1d36ea1ff2381fd14288f265ec7043a837565"
},
"downloads": -1,
"filename": "groundhog_hpc-0.3.1.tar.gz",
"has_sig": false,
"md5_digest": "63c90b752d1ac0e39017e2b60839311b",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.10",
"size": 120624,
"upload_time": "2025-10-27T14:29:48",
"upload_time_iso_8601": "2025-10-27T14:29:48.155304Z",
"url": "https://files.pythonhosted.org/packages/db/86/e81cb4ef8eeddf25b60eb321a1d85214b24b0be48cd2f73a014359ff4264/groundhog_hpc-0.3.1.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-10-27 14:29:48",
"github": false,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"lcname": "groundhog-hpc"
}