# SHUI
Spark-Hadoop Unix Installer
![OSes](https://img.shields.io/badge/system-macOS%7CLinux%7CFreeBSD-green)
![Architectures](https://img.shields.io/badge/arch-i686%7Cx86__64-yellowgreen)
[![Python](https://img.shields.io/pypi/pyversions/shui.svg?logo=python&logoColor=white)](https://pypi.org/project/shui)
[![PyPI version](https://badge.fury.io/py/shui.svg)](https://badge.fury.io/py/shui)
[![PyPI downloads](https://img.shields.io/pypi/dm/shui)](https://img.shields.io/pypi/dm/shui)
[![Code style](https://github.com/jemrobinson/shui/actions/workflows/check-code-formatting.yaml/badge.svg)](https://github.com/jemrobinson/shui/actions/workflows/check-code-formatting.yaml)
[![Functionality](https://github.com/jemrobinson/shui/actions/workflows/test-functionality.yaml/badge.svg)](https://github.com/jemrobinson/shui/actions/workflows/test-functionality.yaml)
This package uses Python to download and unpack a pre-built Spark/Hadoop release from Apache.
Its primary use case is simplifying unattended installs where the user wants "the latest available version" of these tools.
## Features
- download a Spark/Hadoop release tarball from Apache.
- verify the tarball against the SHA512 checksum provided by Apache.
- unpack the tarball into a target directory on your local system.
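The verification step above is an ordinary checksum comparison: hash the downloaded file with SHA512 and compare against the published digest. A minimal sketch in Python (the function name and files here are illustrative, not shui's actual API):

```python
import hashlib
from pathlib import Path

def sha512_matches(tarball: Path, expected_hex: str, chunk_size: int = 1 << 20) -> bool:
    """Return True if the file's SHA512 digest equals the expected hex string."""
    digest = hashlib.sha512()
    with tarball.open("rb") as fh:
        # Read in chunks so large release tarballs don't need to fit in memory.
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_hex.strip().lower()

# Illustrative usage with a throwaway file:
path = Path("example.bin")
path.write_bytes(b"hello spark")
expected = hashlib.sha512(b"hello spark").hexdigest()
print(sha512_matches(path, expected))  # True
path.unlink()
```

Streaming the file in chunks matters for Spark tarballs, which are several hundred megabytes.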
## Installation
Install `shui` using pip: `pip install shui`.
## Usage
### Versions
The `versions` command shows you all available Spark/Hadoop versions.
```
USAGE
shui versions [--latest]
OPTIONS
--latest Show only the latest available version
GLOBAL OPTIONS
-h (--help) Display this help message
-q (--quiet) Do not output any message
-v (--verbose) Increase the verbosity of messages: "-v" for normal output, "-vv" for more verbose output and "-vvv" for debug
-V (--version) Display this application version
--ansi Force ANSI output
--no-ansi Disable ANSI output
-n (--no-interaction) Do not ask any interactive question
```
### Install
The `install` command will download, verify and install a particular Spark/Hadoop version.
```
USAGE
shui install [--latest] [--spark <...>] [--hadoop <...>] [--target <...>]
OPTIONS
--latest Use the latest available version
--spark Spark version (default: "any")
--hadoop Hadoop version (default: "any")
--target Directory to install into (default: "cwd")
GLOBAL OPTIONS
-h (--help) Display this help message
-q (--quiet) Do not output any message
-v (--verbose) Increase the verbosity of messages: "-v" for normal output, "-vv" for more verbose output and "-vvv" for debug
-V (--version) Display this application version
--ansi Force ANSI output
--no-ansi Disable ANSI output
-n (--no-interaction) Do not ask any interactive question
```
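Picking "the latest available version" for an unattended install amounts to sorting release names numerically rather than lexically (a string sort would rank "3.9" above "3.10"). A hedged sketch of how such a pick could work; the file-name pattern below is an assumption for illustration, not shui's internal logic:

```python
import re

def latest_release(filenames: list[str]) -> str:
    """Pick the highest (Spark, Hadoop) version pair from release file names."""
    pattern = re.compile(r"spark-(\d+(?:\.\d+)*)-bin-hadoop(\d+(?:\.\d+)*)")

    def key(name: str) -> tuple:
        match = pattern.search(name)
        if match is None:
            raise ValueError(f"unrecognised release name: {name}")
        spark, hadoop = match.groups()
        # Numeric tuples sort correctly where strings would not ("3.10" > "3.9").
        return (
            tuple(int(part) for part in spark.split(".")),
            tuple(int(part) for part in hadoop.split(".")),
        )

    return max(filenames, key=key)

releases = [
    "spark-3.4.2-bin-hadoop3.tgz",
    "spark-3.5.1-bin-hadoop3.tgz",
    "spark-2.4.8-bin-hadoop2.7.tgz",
]
print(latest_release(releases))  # spark-3.5.1-bin-hadoop3.tgz
```

Combined with `--no-interaction`, this kind of selection is what makes `shui install --latest` usable from a provisioning script with no human in the loop.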