pdal-parallelizer

Name: pdal-parallelizer
Version: 2.1.0
Summary: A simple tool for using PDAL with parallel execution
License: BSD 3-Clause License
Keywords: pdal, parallelizer, dask
Upload time: 2023-04-18 09:23:18
================================================
PDAL-PARALLELIZER
================================================

Some point cloud processing tasks can be very time-consuming. One way to address this is to run the calculations in parallel across several processes on the same machine. pdal-parallelizer makes it simple to put the full power of your machine at the service of your processing.

pdal-parallelizer is a tool that processes your point clouds through pipelines executed on several cores of your machine. It uses the flexible open-source Python library Dask for the multiprocessing side, and lets you use the power of the Point Data Abstraction Library (PDAL) to write your pipelines.

It also protects you against problems during execution. Because point cloud processing can take a long time, you do not want to restart from the beginning if something goes wrong. pdal-parallelizer therefore serializes each pipeline to protect you from this.

Read the documentation for more details: https://pdal-parallelizer.readthedocs.io/

Installation
-----------------------------------------------

Using Pip
................................................

.. code-block::

  pip install pdal-parallelizer

Using Conda
................................................

.. code-block::

  conda install -c clementalba pdal-parallelizer
  
GitHub
................................................

The repository of pdal-parallelizer is available at https://github.com/meldig/pdal-parallelizer

Usage
-----------------------------------------------

Config file
................................................

Your configuration file must look like this:

.. code-block:: json

  {
      "input": "The folder that contains your input files (or a file path)",
      "output": "The folder that will receive your output files",
      "temp": "The folder that will contain your temporary files",
      "pipeline": "Your pipeline path"
  }
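
For example, a filled-in configuration might look like this (the paths are purely illustrative):

.. code-block:: json

  {
      "input": "./data/input",
      "output": "./data/output",
      "temp": "./data/temp",
      "pipeline": "./pipeline.json"
  }

The ``pipeline`` entry points to a PDAL pipeline JSON file describing the stages to apply to each file or tile; see the PDAL documentation for the available readers, filters and writers.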

Processing pipelines with the API
................................................

.. code-block:: python

    from pdal_parallelizer import process_pipelines as process

    process(config="./config.json", input_type="single", timeout=500, n_workers=5, diagnostic=True)
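
Because pdal-parallelizer uses Dask worker processes under the hood, it is generally safer to guard the call with a ``__main__`` check when running it as a script (needed on platforms where new processes are spawned rather than forked). A minimal sketch with illustrative values, assuming ``input_type="dir"`` mirrors the CLI's ``-it dir`` option:

.. code-block:: python

    from pdal_parallelizer import process_pipelines as process

    if __name__ == "__main__":
        # The guard keeps spawned Dask worker processes from
        # re-executing this module when they import it.
        process(
            config="./config.json",  # JSON config file described above
            input_type="dir",        # assumed: process a whole folder, like the CLI's -it dir
            n_workers=4,             # number of Dask workers (illustrative)
            diagnostic=True          # diagnostic option, as in the example above
        )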

Processing pipelines with the CLI
................................................

.. code-block:: 

  pdal-parallelizer process-pipelines -c <config file> -it dir -nw <n_workers> -tpw <threads_per_worker> -dr <number of files> -d
  pdal-parallelizer process-pipelines -c <config file> -it single -nw <n_workers> -tpw <threads_per_worker> -ts <tiles size> -d -dr <number of tiles> -b <buffer size>
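
For example, a concrete invocation over a directory of files might look like this (all option values are illustrative):

.. code-block::

  pdal-parallelizer process-pipelines -c ./config.json -it dir -nw 6 -tpw 1 -dr 5 -d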

Requirements (only for pip installs)
...........................................

Python 3.9+ (e.g. ``conda install -c anaconda python``)

PDAL 2.4+ (e.g. ``conda install -c conda-forge pdal``)

            
