airflowctl

Name: airflowctl
Version: 0.2.9
Home page: https://github.com/kaxil/airflowctl
Summary: A CLI tool to streamline getting started with Apache Airflow™ and managing multiple Airflow projects.
Upload time: 2023-10-05 02:13:22
Author: Kaxil Naik
Requires Python: >=3.7,<4.0
License: Apache-2.0
Keywords: airflow, installer, cli, apache-airflow
# airflowctl

[![PyPI](https://img.shields.io/pypi/v/airflowctl)](https://pypi.org/project/airflowctl/)
[![License](https://img.shields.io/:license-Apache%202-blue.svg)](https://www.apache.org/licenses/LICENSE-2.0.txt)
[![Python](https://img.shields.io/pypi/pyversions/airflowctl.svg)](https://pypi.python.org/pypi/airflowctl)
[![PyPI - Downloads](https://img.shields.io/pypi/dm/airflowctl)](https://pypi.org/project/airflowctl/)
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
[![pre-commit.ci status](https://results.pre-commit.ci/badge/github/kaxil/airflowctl/main.svg)](https://results.pre-commit.ci/latest/github/kaxil/airflowctl/main)

`airflowctl` is a command-line tool for managing Apache Airflow™ projects.
It provides a set of commands to initialize, build, start, stop, and manage Airflow projects.
With `airflowctl`, you can easily set up and manage your Airflow projects, install
specific versions of Apache Airflow, and manage virtual environments.

The main goal of `airflowctl` is to let first-time Airflow users install and set up Airflow with a single command,
and to let existing Airflow users manage multiple Airflow projects with different Airflow versions on the same machine.

## Features

- **Project Initialization with Connections & Variables:** Initialize a new Airflow project with customizable project name, Apache Airflow
version, and Python version. It also allows you to manage Airflow connections and variables.
- **Automatic Virtual Environment Management:** Automatically create and manage virtual environments for
your Airflow projects, even for Python versions that are not installed on your system.
- **Airflow Version Management:** Install and manage specific versions of Apache Airflow.
- **Background Process Management:** Start and stop Airflow in the background with process management capabilities.
- **Live Logs Display:** Continuously display live logs of background Airflow processes with optional log filtering.

## Table of Contents

- [Installation](#installation)
- [Quickstart](#quickstart)
- [Usage](#usage)
  - [Step 1: Initialize a New Project](#step-1-initialize-a-new-project)
  - [Step 2: Build the Project](#step-2-build-the-project)
  - [Step 3: Start Airflow](#step-3-start-airflow)
  - [Step 4: Monitor Logs](#step-4-monitor-logs)
  - [Step 5: Stop Airflow](#step-5-stop-airflow)
  - [Step 6: List Airflow Projects](#step-6-list-airflow-projects)
  - [Step 7: Show Project Info](#step-7-show-project-info)
  - [Step 8: Running Airflow Commands](#step-8-running-airflow-commands)
  - [Step 9: Changing Airflow Configurations](#step-9-changing-airflow-configurations)
- [Using with other Airflow tools](#using-with-other-airflow-tools)
  - [Astro CLI](#astro-cli)

## Installation

```bash
pip install airflowctl
```
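
If you need a reproducible setup, you can pin the release; for example (0.2.9 is the version documented on this page):

```bash
pip install "airflowctl==0.2.9"
```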

## Quickstart

To initialize a new Airflow project with the latest Airflow version, build a virtual environment,
and run the project, use the following command:

```shell
airflowctl init my_airflow_project --build-start
```

This will start Airflow and display the logs in the terminal. You can
access the Airflow UI at http://localhost:8080. To stop Airflow, press `Ctrl+C`.
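
If you prefer to run each step yourself instead of using `--build-start`, the overall flow looks like the sketch below; each step is covered in detail in the next section:

```shell
airflowctl init my_airflow_project                 # Step 1: scaffold the project
cd my_airflow_project
airflowctl build                                   # Step 2: create the venv and install Airflow
airflowctl start my_airflow_project --background   # Step 3: start Airflow in the background
airflowctl logs my_airflow_project                 # Step 4: tail live logs
airflowctl stop my_airflow_project                 # Step 5: stop Airflow
```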

## Usage

### Step 1: Initialize a New Project

To create a new Apache Airflow project, use the init command.
This command sets up the basic project structure, including configuration files,
directories, and sample DAGs.


```shell
airflowctl init <project_name> --airflow-version <version> --python-version <version>
```

Example:

```shell
airflowctl init my_airflow_project --airflow-version 2.6.3 --python-version 3.8
```

This creates a new project directory with the following structure:

```shell
my_airflow_project
├── .env
├── .gitignore
├── dags
│   └── example_dag_basic.py
├── plugins
├── requirements.txt
└── settings.yaml
```

Description of the files and directories:
- `.env` file contains the environment variables for the project.
- `.gitignore` file contains the default gitignore settings.
- `dags` directory contains the sample DAGs.
- `plugins` directory contains the sample plugins.
- `requirements.txt` file contains the project dependencies.
- `settings.yaml` file contains the project settings, including the project name,
Airflow version, Python version, and virtual environment path.

In our example, the `settings.yaml` file would look like this:

```yaml
# Airflow version to be installed
airflow_version: "2.6.3"

# Python version for the project
python_version: "3.8"


# Path to a virtual environment to be used for the project
mode:
  virtualenv:
    venv_path: "PROJECT_DIR/.venv"

# Airflow connections
connections:
    # Example connection
    # - conn_id: example
    #   conn_type: http
    #   host: http://example.com
    #   port: 80
    #   login: user
    #   password: pass
    #   schema: http
    #   extra:
    #      example_extra_field: example-value

# Airflow variables
variables:
    # Example variable
    # - key: example
    #   value: example-value
    #   description: example-description
```

Edit the `settings.yaml` file to customize the project settings.
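
For example, uncommenting the templates above and filling in values might look like this (all values are illustrative):

```yaml
# Airflow connections
connections:
  - conn_id: example
    conn_type: http
    host: http://example.com
    login: user
    password: pass

# Airflow variables
variables:
  - key: example
    value: example-value
    description: example-description
```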

### Step 2: Build the Project

The build command creates the virtual environment, installs the specified Apache Airflow
version, and sets up the project dependencies.

Run the build command from the project directory:

```shell
cd my_airflow_project
airflowctl build
```

The CLI relies on [`pyenv`](https://github.com/pyenv/pyenv) to download and install a Python version if the version is not already installed.

For example, if you have Python 3.8 installed but specify Python 3.7 in the `settings.yaml` file,
the CLI will first install Python 3.7 using `pyenv` and then create a virtual environment with it.

Optionally, you can choose a custom virtual environment path if you have already installed the
apache-airflow package and other dependencies.
Pass the existing virtualenv path using the `--venv_path` option of the `init` command, or set it in the `settings.yaml` file.
Make sure the existing virtualenv has the same Airflow and Python versions as stated in your `settings.yaml` file.
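
For instance, pointing `init` at an existing virtualenv might look like this (the path is illustrative):

```shell
airflowctl init my_airflow_project --airflow-version 2.6.3 --python-version 3.8 --venv_path ~/venvs/airflow-2.6.3
```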

### Step 3: Start Airflow

To start Airflow services, use the start command.
This command activates the virtual environment and launches the Airflow web server and scheduler.

Example:

```shell
airflowctl start my_airflow_project
```

You can also start Airflow in the background with the `--background` flag:

```shell
airflowctl start my_airflow_project --background
```

### Step 4: Monitor Logs

To monitor logs from the background Airflow processes, use the logs command.
This command displays live logs and provides options to filter logs for specific components.

Example:

```shell
airflowctl logs my_airflow_project
```

To filter logs for specific components:

```shell
# Filter logs for scheduler
airflowctl logs my_airflow_project -s

# Filter logs for webserver
airflowctl logs my_airflow_project -w

# Filter logs for triggerer
airflowctl logs my_airflow_project -t

# Filter logs for scheduler and webserver
airflowctl logs my_airflow_project -s -w
```

### Step 5: Stop Airflow

To stop Airflow services if they are still running, use the stop command.

Example:

```shell
airflowctl stop my_airflow_project
```

### Step 6: List Airflow Projects

To list all Airflow projects, use the list command.

Example:

```shell
airflowctl list
```

### Step 7: Show Project Info

To show project info, use the info command.

Example:

```shell
# From the project directory
airflowctl info

# From outside the project directory
airflowctl info my_airflow_project
```

### Step 8: Running Airflow commands

To run Airflow commands, use the `airflowctl airflow` command. All arguments after
`airflowctl airflow` are passed to the Airflow CLI:

```shell
# From the project directory
airflowctl airflow <airflow_command>
```

Example:

```shell
$ airflowctl airflow version
2.6.3
```

You can also run `airflowctl airflow --help` to see the list of available commands.

```shell
$ airflowctl airflow --help
Usage: airflowctl airflow [OPTIONS] COMMAND [ARGS]...

  Run Airflow commands.

Positional Arguments:
  GROUP_OR_COMMAND

    Groups:
      celery         Celery components
      config         View configuration
      connections    Manage connections
      dags           Manage DAGs
      db             Database operations
      jobs           Manage jobs
      kubernetes     Tools to help run the KubernetesExecutor
      pools          Manage pools
      providers      Display providers
      roles          Manage roles
      tasks          Manage tasks
      users          Manage users
      variables      Manage variables

    Commands:
      cheat-sheet    Display cheat sheet
      dag-processor  Start a standalone Dag Processor instance
      info           Show information about current Airflow and environment
      kerberos       Start a kerberos ticket renewer
      plugins        Dump information about loaded plugins
      rotate-fernet-key
                     Rotate encrypted connection credentials and variables
      scheduler      Start a scheduler instance
      standalone     Run an all-in-one copy of Airflow
      sync-perm      Update permissions for existing roles and optionally DAGs
      triggerer      Start a triggerer instance
      version        Show the version
      webserver      Start a Airflow webserver instance

Options:
  -h, --help         show this help message and exit
```

Example:

```shell
# Listing dags
$ airflowctl airflow dags list
dag_id            | filepath             | owner   | paused
==================+======================+=========+=======
example_dag_basic | example_dag_basic.py | airflow | True


# Running standalone
$ airflowctl airflow standalone
```

Or you can activate the virtual environment first and then run the commands as shown below.

Example:

```shell
# From the project directory
source .venv/bin/activate

# Source all the environment variables
source .env
airflow version
```

To add a new DAG, add the DAG file to the `dags` directory.

To edit an existing DAG, edit the DAG file in the `dags` directory.
The changes will be reflected in the Airflow web server.
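
For example, adding a DAG might look like this (`my_new_dag.py` is a hypothetical file):

```shell
# Copy a DAG file into the project's dags directory
cp ~/my_new_dag.py dags/

# Confirm it shows up
airflowctl airflow dags list
```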

### Step 9: Changing Airflow Configurations

`airflowctl` by default uses SQLite as the backend database and `SequentialExecutor` as the executor.
However, if you want to use other databases or executors, you can stop the project and
either a) edit the `airflow.cfg` file or b) add environment variables to the `.env` file.

Example:

```shell
# Stop the project
airflowctl stop my_airflow_project

# Changing the executor to LocalExecutor
# Change the database to PostgreSQL if you already have it installed
echo "AIRFLOW__DATABASE__SQL_ALCHEMY_CONN=postgresql+psycopg2://airflow:airflow@localhost:5432/airflow" >> .env
echo "AIRFLOW__CORE__EXECUTOR=LocalExecutor" >> .env

# Start the project
airflowctl start my_airflow_project
```

Check the [Airflow documentation](https://airflow.apache.org/docs/apache-airflow/stable/configurations-ref.html)
for all the available Airflow configurations.

#### Using Local Executor with SQLite

For Airflow >= 2.6, you can run `LocalExecutor` with SQLite as the backend database by
adding the following environment variables to the `.env` file:

```shell
_AIRFLOW__SKIP_DATABASE_EXECUTOR_COMPATIBILITY_CHECK=1
AIRFLOW__CORE__EXECUTOR=LocalExecutor
```

This should happen automatically when you run the `airflowctl airflow` command if the
Airflow version is `2.6.*`.
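
If you do need to set them yourself, you could append them to `.env` the same way as in Step 9 (a sketch mirroring the commands above):

```shell
echo "_AIRFLOW__SKIP_DATABASE_EXECUTOR_COMPATIBILITY_CHECK=1" >> .env
echo "AIRFLOW__CORE__EXECUTOR=LocalExecutor" >> .env
```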

> [!WARNING]
> SQLite is not recommended for production use. Use it for development and testing only.

### Other Commands

For more information and options, you can use the `--help` flag with each command.
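
For example:

```shell
# Top-level help
airflowctl --help

# Help for a specific command
airflowctl start --help
```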

## Using with other Airflow tools

`airflowctl` can be used with other Airflow projects as long as the project structure is the same.

### Astro CLI

`airflowctl` can be used with [Astro CLI](https://github.com/astronomer/astro-cli) projects too.

While `airflowctl` lets you run Airflow locally using virtual environments, Astro CLI
lets you run Airflow locally using Docker.

`airflowctl` can read the `airflow_settings.yaml` file generated by Astro CLI to load connections and
variables, and it reuses that file as the `settings` file for `airflowctl`.

For example, if you have an Astro CLI project:
- Run the `airflowctl init . --build-start` command to initialize `airflowctl` from the project directory.
Press `y` to continue when prompted.
- It will then prompt you for the Airflow version. Enter the version you are using, or press Enter to
accept the default (the latest Airflow version).
- It will use the installed Python version as the project's Python version. If you want to use a
different Python version, specify it in the `python_version` field of the `airflow_settings.yaml` file.

```shell
# From the project directory
$ cd astro_project
$ airflowctl init . --build-start
Directory /Users/xyz/astro_project is not empty. Continue? [y/N]: y
Project /Users/xyz/astro_project added to tracking.
Airflow project initialized in /Users/xyz/astro_project
Detected Astro project. Using Astro settings file (/Users/kaxilnaik/Desktop/proj1/astro_project/airflow_settings.yaml).
'airflow_version' not found in airflow_settings.yaml file. What is the Airflow version? [2.6.3]:
Virtual environment created at /Users/xyz/astro_project/.venv
...
...
```

If you see an error like the following, remove the `airflow.cfg` file from the project directory,
remove `AIRFLOW_HOME` from the `.env` file if it exists, and try again.

```shell
Error: there might be a problem with your project starting up.
The webserver health check timed out after 1m0s but your project will continue trying to start.
Run 'astro dev logs --webserver | --scheduler' for details.
```
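
A minimal fix might look like the following sketch (GNU `sed` shown; on macOS, `sed -i` needs a backup suffix, e.g. `sed -i ''`):

```shell
# From the Astro project directory
rm -f airflow.cfg

# Drop the AIRFLOW_HOME line from .env, if present
sed -i '/^AIRFLOW_HOME/d' .env

# Try again
airflowctl init . --build-start
```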

## License

This project is licensed under the terms of the [Apache 2.0 License](LICENSE).


            
