# hdfs

* Name: hdfs
* Version: 2.7.2
* Summary: HdfsCLI: API and command line interface for HDFS.
* Home page: https://hdfscli.readthedocs.io
* Author: Matthieu Monsch (mtth@apache.org)
* License: MIT
* Upload time: 2023-08-09 16:19:49
* Download: https://files.pythonhosted.org/packages/a8/9f/723bf4d4c92e85562430b21b096fb8baa87ff39cd5d784ddd17df2b0146f/hdfs-2.7.2.tar.gz
# HdfsCLI [![CI](https://github.com/mtth/hdfs/actions/workflows/ci.yml/badge.svg)](https://github.com/mtth/hdfs/actions/workflows/ci.yml) [![Pypi badge](https://badge.fury.io/py/hdfs.svg)](https://pypi.python.org/pypi/hdfs/) [![Downloads badge](https://img.shields.io/pypi/dm/hdfs.svg)](https://pypistats.org/packages/hdfs)

API and command line interface for HDFS.

```
$ hdfscli --alias=dev

Welcome to the interactive HDFS python shell.
The HDFS client is available as `CLIENT`.

In [1]: CLIENT.list('models/')
Out[1]: ['1.json', '2.json']

In [2]: CLIENT.status('models/2.json')
Out[2]: {
  'accessTime': 1439743128690,
  'blockSize': 134217728,
  'childrenNum': 0,
  'fileId': 16389,
  'group': 'supergroup',
  'length': 48,
  'modificationTime': 1439743129392,
  'owner': 'drwho',
  'pathSuffix': '',
  'permission': '755',
  'replication': 1,
  'storagePolicy': 0,
  'type': 'FILE'
}

In [3]: with CLIENT.read('models/2.json', encoding='utf-8') as reader:
  ...:     from json import load
  ...:     model = load(reader)
  ...:
```
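The `accessTime` and `modificationTime` fields in the `status()` response are milliseconds since the Unix epoch. A small standard-library sketch for converting them into something readable:

```python
from datetime import datetime, timezone

def ms_to_datetime(ms):
    """Convert a WebHDFS millisecond epoch timestamp to an aware UTC datetime."""
    return datetime.fromtimestamp(ms / 1000, tz=timezone.utc)

# 'modificationTime' from the status() output above.
print(ms_to_datetime(1439743129392).isoformat())
```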

## Features

* Python 3 bindings for the [WebHDFS][] (and [HttpFS][]) API,
  supporting both secure and insecure clusters.
* Command line interface to transfer files and start an interactive client
  shell, with aliases for convenient namenode URL caching.
* Additional functionality through optional extensions:

  + `avro`, to [read and write Avro files directly from HDFS][].
  + `dataframe`, to [load and save Pandas dataframes][].
  + `kerberos`, to [support Kerberos authenticated clusters][].

See the [documentation][] to learn more.
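Outside the interactive shell, a client can also be constructed directly in Python. A minimal sketch against a live cluster, assuming an unsecured namenode serving WebHDFS at `localhost:50070` (the URL, user name, and paths are illustrative):

```python
from hdfs import InsecureClient

# Point the client at the namenode's WebHDFS endpoint.
client = InsecureClient('http://localhost:50070', user='drwho')

# Write a small file, then read it back.
client.write('models/3.json', data='{"weights": []}', encoding='utf-8', overwrite=True)
with client.read('models/3.json', encoding='utf-8') as reader:
    print(reader.read())
```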

## Getting started

```sh
$ pip install hdfs
```

Then hop on over to the [quickstart][] guide. A [Conda
feedstock](https://github.com/conda-forge/python-hdfs-feedstock) is also
available.
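Aliases such as `dev` above live in HdfsCLI's configuration file (`~/.hdfscli.cfg` by default). A minimal sketch, assuming a namenode serving WebHDFS on `localhost:50070`; the `user` key is optional and illustrative:

```cfg
[global]
default.alias = dev

[dev.alias]
url = http://localhost:50070
user = drwho
```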

## Testing

HdfsCLI is tested against both [WebHDFS][] and [HttpFS][]. There are two ways
of running tests (see `scripts/` for helpers to set up a test HDFS cluster):

```sh
$ HDFSCLI_TEST_URL=http://localhost:50070 pytest # Using a namenode's URL.
$ HDFSCLI_TEST_ALIAS=dev pytest # Using an alias.
```

## Contributing

We'd love to hear what you think on the [issues][] page. Pull requests are also
most welcome!

[HttpFS]: http://hadoop.apache.org/docs/current/hadoop-hdfs-httpfs/
[WebHDFS]: http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-hdfs/WebHDFS.html
[read and write Avro files directly from HDFS]: https://hdfscli.readthedocs.io/en/latest/api.html#module-hdfs.ext.avro
[load and save Pandas dataframes]: https://hdfscli.readthedocs.io/en/latest/api.html#module-hdfs.ext.dataframe
[support Kerberos authenticated clusters]: https://hdfscli.readthedocs.io/en/latest/api.html#module-hdfs.ext.kerberos
[documentation]: https://hdfscli.readthedocs.io/
[quickstart]: https://hdfscli.readthedocs.io/en/latest/quickstart.html
[issues]: https://github.com/mtth/hdfs/issues

            
