azure-datalake-store
====================

:Name: azure-datalake-store
:Version: 0.0.53
:Home page: https://github.com/Azure/azure-data-lake-store-python
:Summary: Azure Data Lake Store Filesystem Client Library for Python
:Upload time: 2023-05-10 21:17:05
:Author: Microsoft Corporation
:License: MIT License
:Keywords: azure
Microsoft Azure Data Lake Store Filesystem Library for Python
=============================================================

.. image:: https://travis-ci.org/Azure/azure-data-lake-store-python.svg?branch=dev
    :target: https://travis-ci.org/Azure/azure-data-lake-store-python
.. image:: https://coveralls.io/repos/github/Azure/azure-data-lake-store-python/badge.svg?branch=master
    :target: https://coveralls.io/github/Azure/azure-data-lake-store-python?branch=master

This project is the Python filesystem library for Azure Data Lake Store.

INSTALLATION
============

To install from source instead of pip (for local testing and development):

.. code-block:: bash

    > pip install -r dev_requirements.txt
    > python setup.py develop

Usage: Sample Code
==================

To play with the code, here is a starting point:

.. code-block:: python

    from azure.datalake.store import core, lib, multithread
    token = lib.auth(tenant_id, username, password)
    adl = core.AzureDLFileSystem(token, store_name=store_name)

    # typical operations
    adl.ls('')
    adl.ls('tmp/', detail=True)
    adl.ls('tmp/', detail=True, invalidate_cache=True)
    adl.cat('littlefile')
    adl.head('gdelt20150827.csv')

    # file-like object
    with adl.open('gdelt20150827.csv', blocksize=2**20) as f:
        print(f.readline())
        print(f.readline())
        print(f.readline())
        # could have passed f to any function requiring a file object:
        # pandas.read_csv(f)

    with adl.open('anewfile', 'wb') as f:
        # data is written on flush/close, or when buffer is bigger than
        # blocksize
        f.write(b'important data')

    adl.du('anewfile')

    # recursively download the whole directory tree with 5 threads and
    # 16MB chunks
    multithread.ADLDownloader(adl, "", 'my_temp_dir', 5, 2**24)

Progress can be tracked using a callback function with the signature ``track(current, total)``.
When passed, the callback keeps track of transferred bytes and is called on each complete chunk.
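As a minimal, framework-free sketch of such a callback (the helper names below are illustrative, not part of azure-datalake-store), using only the standard library:

```python
import sys

def render_bar(current, total, width=64):
    # Build a text progress bar like "[####      ]  40.0881%".
    filled = int(width * current / total) if total else width
    pct = 100.0 * current / total if total else 100.0
    return '[%s%s] %8.4f%%' % ('#' * filled, ' ' * (width - filled), pct)

def track(current, total):
    # Matches the track(current, total) callback signature described above.
    end = '\n' if current >= total else ''
    sys.stdout.write('\r' + render_bar(current, total) + end)
    sys.stdout.flush()
```

A callable like ``track`` could then be passed as ``progress_callback`` to the uploader or downloader.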

Here's an example using the Azure CLI progress controller as the `progress_callback`:

.. code-block:: python

    from cli.core.application import APPLICATION

    def _update_progress(current, total):
        hook = APPLICATION.get_progress_controller(det=True)
        hook.add(message='Alive', value=current, total_val=total)
        if total == current:
            hook.end()

    ...
    ADLUploader(client, destination_path, source_path, thread_count, overwrite=overwrite,
            chunksize=chunk_size,
            buffersize=buffer_size,
            blocksize=block_size,
            progress_callback=_update_progress)

This will output a progress bar to stdout:

.. code-block:: bash

    Alive[#########################                                       ]  40.0881%
    
    Finished[#############################################################]  100.0000%

Usage: Command Line Sample
==========================

To interact with the API at a higher level, you can use the provided
command-line interface in "samples/cli.py". You will need to set
the following environment variables:

* :code:`azure_username`

* :code:`azure_password`

* :code:`azure_data_lake_store_name`

* :code:`azure_subscription_id`

* :code:`azure_resource_group_name`

* :code:`azure_service_principal`

* :code:`azure_service_principal_secret`

to connect to the Azure Data Lake Store. Optionally, you may need to define :code:`azure_tenant_id` or :code:`azure_data_lake_store_url_suffix`.
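Before launching the CLI, you might sanity-check that these variables are set. A small illustrative helper (not part of the sample itself):

```python
import os

REQUIRED_VARS = [
    'azure_username', 'azure_password', 'azure_data_lake_store_name',
    'azure_subscription_id', 'azure_resource_group_name',
    'azure_service_principal', 'azure_service_principal_secret',
]

def missing_vars(env=os.environ, required=REQUIRED_VARS):
    # Return the names of required variables that are not set.
    return [name for name in required if name not in env]

if __name__ == '__main__':
    missing = missing_vars()
    if missing:
        raise SystemExit('Missing environment variables: ' + ', '.join(missing))
```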

Below is a simple example; more details follow.


.. code-block:: bash

    python samples/cli.py ls -l

Execute the program without arguments to access documentation.

To start the CLI in interactive mode, run "python samples/cli.py"
and then type "help" to see all available commands (similar to Unix utilities):

.. code-block:: bash

    > python samples/cli.py
    azure> help

    Documented commands (type help <topic>):
    ========================================
    cat    chmod  close  du      get   help  ls     mv   quit  rmdir  touch
    chgrp  chown  df     exists  head  info  mkdir  put  rm    tail

    azure>


While still in interactive mode, you can run "ls -l" to list the entries in the
home directory ("help ls" will show the command's usage details). If you're not
familiar with the Unix/Linux "ls" command, the columns represent 1) permissions,
2) file owner, 3) file group, 4) file size, 5-7) file's modification time, and
8) file name.

.. code-block:: bash

    > python samples/cli.py
    azure> ls -l
    drwxrwx--- 0123abcd 0123abcd         0 Aug 02 12:44 azure1
    -rwxrwx--- 0123abcd 0123abcd   1048576 Jul 25 18:33 abc.csv
    -r-xr-xr-x 0123abcd 0123abcd        36 Jul 22 18:32 xyz.csv
    drwxrwx--- 0123abcd 0123abcd         0 Aug 03 13:46 tmp
    azure> ls -l --human-readable
    drwxrwx--- 0123abcd 0123abcd   0B Aug 02 12:44 azure1
    -rwxrwx--- 0123abcd 0123abcd   1M Jul 25 18:33 abc.csv
    -r-xr-xr-x 0123abcd 0123abcd  36B Jul 22 18:32 xyz.csv
    drwxrwx--- 0123abcd 0123abcd   0B Aug 03 13:46 tmp
    azure>
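The column layout described above can be illustrated with a small parsing sketch (a hypothetical helper, not part of the CLI):

```python
def parse_ls_entry(line):
    # Split one "ls -l" line into its eight columns:
    # permissions, owner, group, size, month, day, time, name.
    perms, owner, group, size, month, day, time, name = line.split(None, 7)
    return {
        'permissions': perms,
        'owner': owner,
        'group': group,
        'size': size,
        'modified': ' '.join((month, day, time)),
        'name': name,
        'is_dir': perms.startswith('d'),
    }
```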


To download a remote file, run "get remote-file [local-file]". The second
argument, "local-file", is optional. If not provided, the local file will be
named after the remote file minus the directory path.
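The default naming rule (remote file minus the directory path) amounts to taking the final path component; a one-line sketch, assuming POSIX-style remote paths:

```python
import posixpath

def default_local_name(remote_path):
    # "get remote-file" with no local name keeps only the final path component.
    return posixpath.basename(remote_path)
```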

.. code-block:: bash

    > python samples/cli.py
    azure> ls -l
    drwxrwx--- 0123abcd 0123abcd         0 Aug 02 12:44 azure1
    -rwxrwx--- 0123abcd 0123abcd   1048576 Jul 25 18:33 abc.csv
    -r-xr-xr-x 0123abcd 0123abcd        36 Jul 22 18:32 xyz.csv
    drwxrwx--- 0123abcd 0123abcd         0 Aug 03 13:46 tmp
    azure> get xyz.csv
    2016-08-04 18:57:48,603 - ADLFS - DEBUG - Creating empty file xyz.csv
    2016-08-04 18:57:48,604 - ADLFS - DEBUG - Fetch: xyz.csv, 0-36
    2016-08-04 18:57:49,726 - ADLFS - DEBUG - Downloaded to xyz.csv, byte offset 0
    2016-08-04 18:57:49,734 - ADLFS - DEBUG - File downloaded (xyz.csv -> xyz.csv)
    azure>


It is also possible to run in command-line mode, allowing any available command
to be executed separately without remaining in the interpreter.

For example, listing the entries in the home directory:

.. code-block:: bash

    > python samples/cli.py ls -l
    drwxrwx--- 0123abcd 0123abcd         0 Aug 02 12:44 azure1
    -rwxrwx--- 0123abcd 0123abcd   1048576 Jul 25 18:33 abc.csv
    -r-xr-xr-x 0123abcd 0123abcd        36 Jul 22 18:32 xyz.csv
    drwxrwx--- 0123abcd 0123abcd         0 Aug 03 13:46 tmp
    >


Also, downloading a remote file:

.. code-block:: bash

    > python samples/cli.py get xyz.csv
    2016-08-04 18:57:48,603 - ADLFS - DEBUG - Creating empty file xyz.csv
    2016-08-04 18:57:48,604 - ADLFS - DEBUG - Fetch: xyz.csv, 0-36
    2016-08-04 18:57:49,726 - ADLFS - DEBUG - Downloaded to xyz.csv, byte offset 0
    2016-08-04 18:57:49,734 - ADLFS - DEBUG - File downloaded (xyz.csv -> xyz.csv)
    >

Tests
=====

For detailed documentation about our test framework, please visit the 
`tests folder <https://github.com/Azure/azure-data-lake-store-python/tree/master/tests>`__.

Need Help?
==========

Be sure to check out the Microsoft Azure `Developer Forums on Stack Overflow <http://go.microsoft.com/fwlink/?LinkId=234489>`__
if you have trouble with the provided code. Most questions are tagged `azure and python <https://stackoverflow.com/questions/tagged/azure+python>`__.


Contribute Code or Provide Feedback
===================================

If you would like to become an active contributor to this project please
follow the instructions provided in `Microsoft Azure Projects Contribution Guidelines <http://azure.github.io/guidelines/>`__. 
Furthermore, check out `GUIDANCE.md <https://github.com/Azure/azure-data-lake-store-python/blob/master/GUIDANCE.md>`__ 
for specific information related to this project.

If you encounter any bugs with the library please file an issue in the
`Issues <https://github.com/Azure/azure-data-lake-store-python/issues>`__
section of the project.


Code of Conduct
===============
This project has adopted the `Microsoft Open Source Code of Conduct <https://opensource.microsoft.com/codeofconduct/>`__. 
For more information see the `Code of Conduct FAQ <https://opensource.microsoft.com/codeofconduct/faq/>`__ or contact 
`opencode@microsoft.com <mailto:opencode@microsoft.com>`__ with any additional questions or comments.


.. :changelog:

Release History
===============

0.0.53 (2023-04-11)
+++++++++++++++++++
* Add MSAL support. Remove ADAL support
* Suppress deprecation warning when detecting pyopenssl existence.

0.0.52 (2020-11-25)
+++++++++++++++++++
* Changed logging verbosity when closing a stream
* Filter out default acl for files when using recursive acl operations

0.0.51 (2020-10-15)
+++++++++++++++++++
* Add more logging for zero byte reads to investigate root cause.

0.0.50 (2020-09-10)
+++++++++++++++++++
* Fix bug with retrying for ADAL exception parsing.

0.0.49 (2020-08-05)
+++++++++++++++++++
* Fix bug with NoRetryPolicy
* Remove Python 3.4 and 3.5 from travis configuration.
* Fix logging for unicode

0.0.48 (2019-10-18)
+++++++++++++++++++
* Buffer writes to prevent out of memory problems
* Add Python 3.7 in travis configuration

0.0.47 (2019-08-14)
+++++++++++++++++++
* Remove logging of bearer token
* Documentation-related changes (add readme.md and correct some formatting)

0.0.46 (2019-06-25)
+++++++++++++++++++
* Expose per request timeout. Default to 60.
* Concat will not retry by default.
* Bug fixes.

0.0.45 (2019-05-10)
+++++++++++++++++++
* Update open and close ADLFile semantics
* Refactor code and improve performance for opening a file

0.0.44 (2019-03-05)
+++++++++++++++++++
* Add continuation token to LISTSTATUS api call
* Update api-version to 2018-09-01

0.0.43 (2019-03-01)
+++++++++++++++++++
* Fix bug in downloader when glob returns a single file

0.0.42 (2019-02-26)
+++++++++++++++++++
* Update docstrings
* Remove logging setlevel to DEBUG for recursive acl operations

0.0.41 (2019-01-31)
+++++++++++++++++++
* Remove GetContentSummary api call
* Move check_token() under retry block
* Expose timeout parameter for AdlDownloader and AdlUploader
* Raise an exception instead of silently break for zero length reads

0.0.40 (2019-01-08)
+++++++++++++++++++
* Fix zero length read
* Remove dependence on custom wheel and conform to PEP 420

0.0.39 (2018-11-14)
+++++++++++++++++++
* Fix for Chunked Decoding exception thrown while reading response.content

0.0.38 (2018-11-12)
+++++++++++++++++++
* Added support for recursive acl functions
* Fixed bug due to missing filesessionid in get_chunk

0.0.37 (2018-11-02)
+++++++++++++++++++
* Reverted some changes introduced in 0.0.35 that didn't work with other tokens

0.0.36 (2018-10-31)
+++++++++++++++++++
* Fixed typo in refresh_token call

0.0.35 (2018-10-29)
+++++++++++++++++++
* Added retry for getting tokens
* Added requests>=2.20 because of CVE 2018-18074
* Fixed test parameters and updated test recordings

0.0.34 (2018-10-15)
+++++++++++++++++++
* Fixed concat issue with plus (or other symbols) in filename
* Added readinto method
* Changed api-version to 2018-05-01 for all.

0.0.32 (2018-10-04)
+++++++++++++++++++
* Fixed test bug
* Fixed empty folder upload bug
* Fixed ADL Downloader block size bug

0.0.31 (2018-09-10)
+++++++++++++++++++
* Added support for batched ls

0.0.30 (2018-08-28)
+++++++++++++++++++
* Fixed .travis.yml order to add azure-nspg dependency

0.0.29 (2018-08-22)
+++++++++++++++++++
* Fixed HISTORY.rst and Pypi documentation

0.0.28 (2018-08-20)
+++++++++++++++++++
* Added recovery from DatalakeBadOffsetException

0.0.27 (2018-08-08)
+++++++++++++++++++
* Fixed bug in single file check
* Added Python2 exception compatibility

0.0.26 (2018-08-03)
+++++++++++++++++++
* Fixed bug due to not importing errno
* Fixed bug in os.makedirs race condition
* Updated Readme with correct environment variables and fixed some links

0.0.25 (2018-07-26)
+++++++++++++++++++
* Fixed downloading of empty directories and download of directory structure with only a single file

0.0.24 (2018-07-16)
+++++++++++++++++++
* Retry policy implemented for all operations, default being Exponential Retry Policy

0.0.23 (2018-07-11)
+++++++++++++++++++
* Fixed the incorrect download location in case of UNC local paths

0.0.22 (2018-06-02)
+++++++++++++++++++
* Encoding filepaths in URI

0.0.21 (2018-06-01)
+++++++++++++++++++
* Remove unused msrest dependency

0.0.20 (2018-05-25)
+++++++++++++++++++
* Compatibility of the sdist with wheel 0.31.0

0.0.19 (2018-03-14)
-------------------
* Fixed upload issue where destination filename was wrong while upload of directory with single file #208

0.0.18 (2018-02-05)
-------------------
* Fixed read issue where whole file was cached while doing positional reads #198
* Fixed readline as well for the same

0.0.17 (2017-09-21)
-------------------
* Fixed README.rst indentation error
* Changed management endpoint

0.0.16 (2017-09-11)
-------------------
* Fixed Multi chunk transfer hangs as merging chunks fails #187
* Added syncflag and leaseid in create, append calls.
* Added filesessionid in create, append and open calls.

0.0.15 (2017-07-26)
-------------------
* Enable Data Lake Store progress controller callback #174
* Fix file state incorrectly marked as "errored" if it contains chunks in "pending" state #182
* Fix Race condition due to `transfer` future `done_callback` #177

0.0.14 (2017-07-10)
-------------------
* Fix an issue where common prefixes in paths for upload and download were collapsed into only unique paths.

0.0.13 (2017-06-28)
-------------------
* Add support for automatic refreshing of service principal credentials

0.0.12 (2017-06-20)
-------------------
* Fix a regression with ls returning the top level folder if it has no contents. It now properly returns an empty array if a folder has no children.

0.0.11 (2017-06-02)
-------------------
* Update to name incomplete file downloads with a `.inprogress` suffix. This suffix is removed when the download completes successfully.

0.0.10 (2017-05-24)
-------------------
* Allow users to explicitly use or invalidate the internal, local cache of the filesystem that is built up from previous `ls` calls. It is now set to always call the service instead of the cache by default.
* Update to properly create the wheel package during build to ensure all pip packages are available.
* Update folder upload/download to properly throw early in the event that the destination files exist and overwrite was not specified. NOTE: target folder existence (or sub folder existence) does not automatically cause failure. Only leaf node existence will result in failure.
* Fix a bug that caused file not found errors when attempting to get information about the root folder.

0.0.9 (2017-05-09)
------------------
* Enforce basic SSL utilization to ensure performance due to `GitHub issue 625 <https://github.com/pyca/pyopenssl/issues/625>`__

0.0.8 (2017-04-26)
------------------
* Fix server-side throttling retry support. This is not a guarantee that if the server is throttling the upload (or download) it will eventually succeed, but there is now a back-off retry in place to make it more likely.

0.0.7 (2017-04-19)
------------------
* Update the build process to more efficiently handle multi-part namespaces for pip.

0.0.6 (2017-03-15)
------------------
* Fix an issue with path caching that should drastically improve performance for download

0.0.5 (2017-03-01)
------------------
* Fix for downloader to ensure there is access to the source path before creating destination files
* Fix for credential objects to inherit from msrest.authentication for more universal authentication support
* Add support for the following:

  * set_expiry: allows for setting expiration on files
  * ACL management:

    * set_acl: allows for the full replacement of an ACL on a file or folder
    * set_acl_entries: allows for "patching" an existing ACL on a file or folder
    * get_acl_status: retrieves the ACL information for a file or folder
    * remove_acl_entries: removes the specified entries from an ACL on a file or folder
    * remove_acl: removes all non-default ACL entries from a file or folder
    * remove_default_acl: removes all default ACL entries from a folder

* Remove unsupported and unused "TRUNCATE" operation.
* Added API-Version support with a default of the latest api version (2016-11-01)

0.0.4 (2017-02-07)
------------------
* Fix for folder upload to properly delete folders with contents when overwrite specified.
* Fix to set verbose output to False/Off by default. This removes progress tracking output by default but drastically improves performance.

0.0.3 (2017-02-02)
------------------
* Fix to setup.py to include the HISTORY.rst file. No other changes

0.0.2 (2017-01-30)
------------------
* Addresses an issue with lib.auth() not properly defaulting to 2FA
* Fixes an issue with Overwrite for ADLUploader sometimes not being honored.
* Fixes an issue with empty files not properly being uploaded and resulting in a hang in progress tracking.
* Addition of a samples directory showcasing examples of how to use the client and upload and download logic.
* General cleanup of documentation and comments.
* This is still based on API version 2016-11-01

0.0.1 (2016-11-21)
------------------
* Initial preview release. Based on API version 2016-11-01.
* Includes initial ADLS filesystem functionality and extended upload and download support.

            

folders with contents when overwrite specified.\r\n* Fix to set verbose output to False/Off by default. This removes progress tracking output by default but drastically improves performance.\r\n\r\n0.0.3 (2017-02-02)\r\n------------------\r\n* Fix to setup.py to include the HISTORY.rst file. No other changes\r\n\r\n0.0.2 (2017-01-30)\r\n------------------\r\n* Addresses an issue with lib.auth() not properly defaulting to 2FA\r\n* Fixes an issue with Overwrite for ADLUploader sometimes not being honored.\r\n* Fixes an issue with empty files not properly being uploaded and resulting in a hang in progress tracking.\r\n* Addition of a samples directory showcasing examples of how to use the client and upload and download logic.\r\n* General cleanup of documentation and comments.\r\n* This is still based on API version 2016-11-01\r\n\r\n0.0.1 (2016-11-21)\r\n------------------\r\n* Initial preview release. Based on API version 2016-11-01.\r\n* Includes initial ADLS filesystem functionality and extended upload and download support.\r\n",
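The ``readinto`` method added in 0.0.34 follows the standard Python buffered-reader contract: it fills a caller-supplied buffer in place and returns the number of bytes read. A minimal sketch of that contract, using ``io.BytesIO`` as a stand-in so no Data Lake store is contacted (the real call would go through a file object from ``adl.open``):

.. code-block:: python

    import io

    # Stand-in for a file object opened with adl.open(...); readinto
    # on the ADLS file object follows the same buffered-reader contract.
    f = io.BytesIO(b"gdelt,2015,08,27\n")

    buf = bytearray(8)       # caller-supplied, preallocated buffer
    n = f.readinto(buf)      # fills buf in place, returns bytes read

    print(n)                 # 8
    print(bytes(buf))        # b'gdelt,20'

Reading into a preallocated buffer avoids allocating a new bytes object per chunk, which is why libraries expose it alongside ``read``.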
    "bugtrack_url": null,
    "license": "MIT License",
    "summary": "Azure Data Lake Store Filesystem Client Library for Python",
    "version": "0.0.53",
    "project_urls": {
        "Homepage": "https://github.com/Azure/azure-data-lake-store-python"
    },
    "split_keywords": [
        "azure"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "882a75f56b14f115189155cf12e46b366ad1fe3357af5a1a7c09f7446662d617",
                "md5": "a2f720ae787fc2f9e6c5d28afd2f7763",
                "sha256": "a30c902a6e360aa47d7f69f086b426729784e71c536f330b691647a51dc42b2b"
            },
            "downloads": -1,
            "filename": "azure_datalake_store-0.0.53-py2.py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "a2f720ae787fc2f9e6c5d28afd2f7763",
            "packagetype": "bdist_wheel",
            "python_version": "py2.py3",
            "requires_python": null,
            "size": 55308,
            "upload_time": "2023-05-10T21:17:02",
            "upload_time_iso_8601": "2023-05-10T21:17:02.629008Z",
            "url": "https://files.pythonhosted.org/packages/88/2a/75f56b14f115189155cf12e46b366ad1fe3357af5a1a7c09f7446662d617/azure_datalake_store-0.0.53-py2.py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "22ff61369d06422b5ac48067215ff404841342651b14a89b46c8d8e1507c8f17",
                "md5": "5c2a6e35439ee42d46e6ea9052724c86",
                "sha256": "05b6de62ee3f2a0a6e6941e6933b792b800c3e7f6ffce2fc324bc19875757393"
            },
            "downloads": -1,
            "filename": "azure-datalake-store-0.0.53.tar.gz",
            "has_sig": false,
            "md5_digest": "5c2a6e35439ee42d46e6ea9052724c86",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": null,
            "size": 71430,
            "upload_time": "2023-05-10T21:17:05",
            "upload_time_iso_8601": "2023-05-10T21:17:05.665669Z",
            "url": "https://files.pythonhosted.org/packages/22/ff/61369d06422b5ac48067215ff404841342651b14a89b46c8d8e1507c8f17/azure-datalake-store-0.0.53.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2023-05-10 21:17:05",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "Azure",
    "github_project": "azure-data-lake-store-python",
    "travis_ci": true,
    "coveralls": true,
    "github_actions": false,
    "appveyor": true,
    "lcname": "azure-datalake-store"
}