edb-deployment

Name: edb-deployment
Version: 3.13.0
Home page: https://github.com/EnterpriseDB/postgres-deployment/
Summary: Postgres Deployment Scripts are an easy way to deploy PostgreSQL, EDB Postgres
Upload time: 2023-01-20 20:07:50
Author: EDB
Requires Python: >=2.7
License: BSD
Keywords: postgresql edb epas cli deploy cloud aws rds aurora azure gcloud vmware
# Getting Started

The `edb-deployment` tool is an easy way to provision Cloud resources and deploy
PostgreSQL, EDB Postgres Advanced Server, and associated tools (high availability,
backup/recovery, monitoring, connection poolers). `edb-deployment` can also be
used to deploy Postgres architectures on existing infrastructure such as physical
servers (baremetal) or local Virtual Machines.

Supported Cloud providers are **AWS**, **Azure** and **Google Cloud**.

`edb-deployment` helps users deploy Postgres Reference Architectures. The list
and details of the Reference Architectures can be found [here](https://github.com/EnterpriseDB/edb-ref-archs/blob/main/edb-reference-architecture-codes/README.md).

`edb-deployment` is an open source tool and is not officially supported by EDB
Support. It is maintained and supported by the GitHub members of this
repository. Please provide feedback by posting issues and providing pull
requests.

Before delving into this repository, it is best to get familiar with the steps
of the deployment process for a specific cloud (AWS, Azure or Google Cloud).

# Pre-Requisites

`edb-deployment` depends on the following components. Install them before
using it.

1. Python 3
2. `pip3`

Third party pre-requisites:

1. **Latest vendor** Cloud CLI or SDK (AWS, Azure or Google Cloud)

   Depending on the cloud provider, install the **latest version** for: AWS
   CLI, Azure CLI or Google Cloud SDK on the system.

2. **Terraform** >= 0.15.1
3. **Ansible** >= 2.10.8
4. **AWS CLI** >= 2.0.45
5. **Azure CLI** >= 2.23.0
6. **Google Cloud CLI** >= 329.0.0

To help with the installation of the third-party pre-requisites listed above,
`edb-deployment` provides the `setup` sub-command, which works on Linux and
Darwin (macOS).
Please refer to section [Pre-Requisites installation](#pre-requisites-installation).

# Installation

## From source code

Installation is done using the `pip3` command. Once the code has been
downloaded, either by cloning the repository or downloading a release, go to
the resulting folder and run `pip3 install`:
```shell
$ cd postgres-deployment
$ sudo pip3 install . --upgrade
```

## From PyPI

```shell
$ sudo pip3 install edb-deployment
```

Make sure the tool is correctly installed by running:
```shell
$ edb-deployment --version
```

## Shell auto-completion

`edb-deployment` supports command line auto-completion with the `tab` key.

Supported shells are `bash` and `zsh`.

To enable auto-completion in the current session, run the following command:
```shell
$ eval "$(register-python-argcomplete edb-deployment)"
```

To enable auto-completion for all sessions, add the command above at the end
of your `~/.bashrc` or `~/.zshrc` file, depending on the shell you use.
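As a minimal sketch (assuming `bash`; substitute `~/.zshrc` for `zsh`):

```shell
# Persist the auto-completion hook by appending it to the bash startup file
echo 'eval "$(register-python-argcomplete edb-deployment)"' >> ~/.bashrc
```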

## Pre-Requisites installation

To ease the installation of the third-party pre-requisite tools like `aws`,
`terraform`, `ansible`, etc., `edb-deployment` provides the `setup`
sub-command.

The following packages are required in order to execute the `setup`
sub-command: `gcc` (Linux only), `python3-devel` (Linux only), `unzip`, `wget`
and `tar`.
These packages should be installed through the usual package manager (`dnf`,
`apt`, `brew`, etc.).
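Of these, the command-line tools can be checked up front; a quick sketch:

```shell
# Report any required command-line tools missing from PATH
# (library packages such as python3-devel cannot be checked this way)
for tool in gcc unzip wget tar; do
    command -v "$tool" >/dev/null 2>&1 || echo "missing: $tool"
done
```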

Note for Debian users: the package `libffi-dev` must be present.

Finally, Python `virtualenv` must be installed with `root` privileges:
```shell
$ sudo pip3 install virtualenv
```

Pre-requisites automated installation:
```shell
$ edb-deployment <CLOUD_VENDOR> setup
```

# Usage

Each new deployment is done under a dedicated namespace, the
`<PROJECT_NAME>`.

`edb-deployment` CLI features are implemented through sub-commands. Each
sub-command can be executed like this:
```shell
$ edb-deployment <CLOUD_VENDOR> <SUB_COMMAND> [<PROJECT_NAME>]
```

## Cloud vendor list

  * `aws`: Amazon Web Services
  * `aws-pot`: EDB POT (Proof Of Technology) on AWS Cloud
  * `aws-rds`: Amazon Web Services RDS for PostgreSQL
  * `aws-rds-aurora`: Amazon Aurora
  * `azure`: Microsoft Azure Cloud
  * `azure-pot`: EDB POT (Proof Of Technology) on Azure Cloud
  * `azure-db`: Microsoft Azure Database
  * `gcloud`: Google Cloud
  * `gcloud-pot`: EDB POT (Proof Of Technology) on Google Cloud
  * `gcloud-sql`: Google Cloud SQL for PostgreSQL

### Local Testing

  * `baremetal`: Baremetal servers and VMs
  * `vmware`: [VMWare Workstation](./VMWARE.md)
  * `virtualbox`: [Virtualbox](./VIRTUALBOX.md)
    * [Windows Support through WSL2](./README-WIN.md)

## Sub-commands

  * `configure`: New project initialization and configuration
  * `provision`: Cloud resources provisioning
  * `destroy`: Cloud resources destruction
  * `deploy`: Postgres and tools deployment
  * `show`: Show configuration
  * `display`: Display project inventory
  * `passwords`: Display project passwords
  * `list`: List projects
  * `specs`: Show Cloud Vendor default specifications
  * `logs`: Show project logs
  * `remove`: Remove project

# How to Use

Deployment of a new project should follow the workflow below:

  1. [Configure Cloud credentials](#configure-cloud-credentials)
  2. [Project configuration](#project-configuration)
  3. [Cloud resources provisioning](#cloud-resources-provisioning)
  4. [Postgres and tools deployment](#postgres-and-tools-deployment)

## Configure Cloud credentials

This step depends on the target Cloud vendor.

If the Cloud tools have been installed with the help of the `setup`
sub-command, it is recommended to update the `PATH` environment variable to
include the tools' binary location:
```shell
$ export PATH=$PATH:$HOME/.edb-cloud-tools/bin
```

### AWS credentials configuration

AWS credentials configuration is done through the `aws` tool. For this step,
you need your **AWS Access Key ID** and **AWS Secret Access Key**. For more
information about Amazon Access Key management, please see the
[official documentation page](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_access-keys.html#Using_CreateAccessKey).

Run the following command and enter the Access Key ID and Secret Access Key:
```shell
$ aws configure
```

### Azure credentials configuration

Azure Cloud credentials configuration can be achieved using the `az` tool with
the following command:
```shell
$ az login --use-device-code
```

### GCloud credentials configuration

GCloud credentials configuration involves more steps than the other Cloud
vendors. The **GCloud project ID** is required.

  1. Login with your email address:
  ```shell
$ gcloud auth login <LOGIN_EMAIL> --no-launch-browser
  ```
  2. Open the link in your browser and copy the given verification code.
  3. Project configuration
  ```shell
$ gcloud config set project <PROJECT_ID>
  ```
  4. To find the IAM account of the service, list the service accounts with
     the following command:
  ```shell
$ gcloud iam service-accounts list
  ```
  5. Finally, to create and download a new service account key:
  ```shell
$ gcloud iam service-accounts keys create ~/accounts.json --iam-account=<IAM_ACCOUNT>
  ```

The JSON file `$HOME/accounts.json` must be kept; it will be required by
`edb-deployment`.

## Project configuration

Once the Cloud vendor credentials have been configured, you can proceed with
the project configuration step.

### Cloud vendor specifications

`edb-deployment` provides default configuration values for the Cloud resources
to be provisioned, like **instance type**, **disk size**, **OS image**,
**additional volumes**, etc.

To change these configuration values, first dump the default values
with:
```shell
$ edb-deployment <CLOUD_VENDOR> specs > my_configuration.json
```

When deploying on baremetal servers, the IP addresses and SSH user must be
configured through the specifications:

```shell
$ edb-deployment baremetal specs --reference-architecture EDB-RA-1 > baremetal-edb-ra-1.json
```

The `baremetal-edb-ra-1.json` file will contain:
```json
{
  "ssh_user": null,
  "pg_data": null,
  "pg_wal": null,
  "postgres_server_1": {
    "name": "pg1",
    "public_ip": null,
    "private_ip": null
  },
  "pem_server_1": {
    "name": "pem1",
    "public_ip": null,
    "private_ip": null
  },
  "backup_server_1": {
    "name": "backup1",
    "public_ip": null,
    "private_ip": null
  }
}
```

Then, you can edit and update the resource configuration stored in the JSON file.
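For instance, a filled-in `baremetal-edb-ra-1.json` could look like this (the
SSH user and addresses are illustrative values, not defaults):

```json
{
  "ssh_user": "rocky",
  "pg_data": null,
  "pg_wal": null,
  "postgres_server_1": {
    "name": "pg1",
    "public_ip": "203.0.113.10",
    "private_ip": "10.0.0.10"
  },
  "pem_server_1": {
    "name": "pem1",
    "public_ip": "203.0.113.11",
    "private_ip": "10.0.0.11"
  },
  "backup_server_1": {
    "name": "backup1",
    "public_ip": "203.0.113.12",
    "private_ip": "10.0.0.12"
  }
}
```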

### Project initialization

Project initialization is done using the `configure` sub-command:
```shell
$ edb-deployment <CLOUD_VENDOR> configure <PROJECT_NAME> \
  -a <REFERENCE_ARCHITECTURE_CODE> \
  -o <OPERATING_SYSTEM> \
  -t <PG_ENGINE_TYPE> \
  -v <PG_VERSION> \
  -u "<EDB_REPO_USERNAME>:<EDB_REPO_PASSWORD>" \
  -r <CLOUD_REGION> \
  -s my_configuration.json
```

Notes:
  * `REFERENCE_ARCHITECTURE_CODE`

    Reference architecture code name. Allowed values are: **EDB-RA-1** for a
    single Postgres node deployment with one backup server and one PEM
    monitoring server, **EDB-RA-2** for a three-node Postgres deployment with
    quorum-based synchronous replication and automatic failover, plus one backup
    server and one PEM monitoring server, **EDB-RA-3** for extending
    **EDB-RA-2** with 3 PgPoolII nodes, and **HammerDB-TPROC-C** for setting up
    a 2-tier configuration for benchmarking with an OLTP workload. Default:
    **EDB-RA-1**

  * `OPERATING_SYSTEM`

    Operating system. Allowed values are: **CentOS7**, **RockyLinux8**, **RedHat7**
    and **RedHat8**. Default: **RockyLinux8**

  * `PG_ENGINE_TYPE`

     Postgres engine type. Allowed values are: **PG** for PostgreSQL, **EPAS**
     for EDB Postgres Advanced Server. Default: **PG**

  * `PG_VERSION`

    PostgreSQL or EPAS version. Allowed values are: **11**, **12**, **13** and **14**.
    Default: **14**

  * `"EDB_REPO_USERNAME:EDB_REPO_PASSWORD"`

    EDB Packages repository credentials. **Required**.

  * `CLOUD_REGION`

    Cloud vendor region. Default value depends on Cloud vendor.
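Putting the notes above together, a complete `configure` invocation might look
like this (the project name, credentials and region are illustrative values):

```shell
$ edb-deployment aws configure myproject \
  -a EDB-RA-2 \
  -o RockyLinux8 \
  -t EPAS \
  -v 14 \
  -u "repo_user:repo_password" \
  -r us-east-1 \
  -s my_configuration.json
```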

For more details, please use:
```shell
$ edb-deployment <CLOUD_VENDOR> configure --help
```

## Cloud resources provisioning

After project configuration, we can proceed to Cloud resources provisioning:
```shell
$ edb-deployment <CLOUD_VENDOR> provision <PROJECT_NAME>
```

## Components deployment

Finally, we can deploy the components with the `deploy` sub-command:
```shell
$ edb-deployment <CLOUD_VENDOR> deploy <PROJECT_NAME>
```

## Other features

List of projects:
```shell
$ edb-deployment <CLOUD_VENDOR> list
```

Execute an Ansible pre-deployment playbook:
```shell
$ edb-deployment <CLOUD_VENDOR> deploy --pre-deploy-ansible pre_deploy_playbook.yml <PROJECT_NAME>
```

Execute an Ansible post-deployment playbook:
```shell
$ edb-deployment <CLOUD_VENDOR> deploy --post-deploy-ansible post_deploy_playbook.yml <PROJECT_NAME>
```

Display a project's inventory:
```shell
$ edb-deployment <CLOUD_VENDOR> display <PROJECT_NAME>
```

Display a project's passwords:
```shell
$ edb-deployment <CLOUD_VENDOR> passwords <PROJECT_NAME>
```

Cloud resources destruction:
```shell
$ edb-deployment <CLOUD_VENDOR> destroy <PROJECT_NAME>
```

Project tree deletion:
```shell
$ edb-deployment <CLOUD_VENDOR> remove <PROJECT_NAME>
```

Open SSH connection to one host of the project:
```shell
$ edb-deployment <CLOUD_VENDOR> ssh <PROJECT_NAME> <NODE_NAME>
```

Note: the node name list can be found by executing the `display` sub-command.
Node names can be `epas1`, `epas2`, `pemserver1`, etc.

Get a copy of the SSH keys and the `ssh_config` file; the files are copied
into the current directory:
```shell
$ edb-deployment <CLOUD_VENDOR> get_ssh_keys <PROJECT_NAME>
```

Note: This sub-command is only available for `aws-pot`, `azure-pot` and
`gcloud-pot`.

# License

Original work Copyright 2019-2020, EnterpriseDB Corporation

All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:

1. Redistributions of source code must retain the above copyright notice, this
list of conditions and the following disclaimer.

2. Redistributions in binary form must reproduce the above copyright notice,
this list of conditions and the following disclaimer in the documentation
and/or other materials provided with the distribution.

3. Neither the name of EnterpriseDB nor the names of its contributors may be
used to endorse or promote products derived from this software without specific
prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR
ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON
ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE.

            
