=======
argcmdr
=======
The thin ``argparse`` wrapper for quick, clear and easy declaration of (hierarchical) console command interfaces via Python.
``argcmdr``:
* handles the boilerplate of CLI

  * while maintaining the clarity and extensibility of your code
  * without requiring you to learn Yet Another argument-definition syntax
  * without reinventing the wheel or sacrificing the flexibility of ``argparse``

* enables invocation via

  * executable script (``__name__ == '__main__'``)
  * ``setuptools`` entrypoint
  * command-defining module (like the ``Makefile`` of ``make``)

* determines command hierarchy flexibly and cleanly

  * command declarations are nested to indicate CLI hierarchy *or*
  * commands are decorated to indicate their hierarchy

* includes support for elegant interaction with the operating system, via ``plumbum``
Setup
=====
``argcmdr`` is developed for Python version 3.6.3 and above, and may be built via ``setuptools``.
Python
------
If Python 3.6.3 or greater is not installed on your system, it is available from python.org_.
However, depending on your system, you might prefer to install Python via a package manager, such as Homebrew_ on Mac OS X or APT on Debian-based Linux systems.
Alternatively, pyenv_ is highly recommended to manage arbitrary installations of Python, and may be most easily installed via the `pyenv installer`_.
argcmdr
-------
To install from PyPI::

    pip install argcmdr

To install from Github::

    pip install git+https://github.com/dssg/argcmdr.git

To install from source::

    python setup.py install
Tutorial
========
.. contents::
   :local:
The Command
-----------
``argcmdr`` is built around the base class ``Command``. Your console command extends ``Command``, and optionally defines:
* ``__init__(parser)``, which adds to the parser the arguments that your command requires, as supported by ``argparse`` (see argparse_)
* ``__call__([args, parser, ...])``, which is invoked when your console command is invoked, and which is expected to implement your command's functionality
For example, let's define the executable file ``listdir``, a trivial script which prints the current directory's contents::

    #!/usr/bin/env python

    import os

    from argcmdr import Command, main

    class Main(Command):
        """print the current directory's contents"""

        def __call__(self):
            print(*os.listdir())

    if __name__ == '__main__':
        main(Main)
Should we execute this script, it will perform much the same as ``ls -A``.
Let's say, however, that we would like to optionally print each item of the directory's contents on a separate line::

    class Main(Command):
        """print the current directory's contents"""

        def __init__(self, parser):
            parser.add_argument(
                '-1',
                action='store_const',
                const='\n',
                default=' ',
                dest='sep',
                help='list one file per line',
            )

        def __call__(self, args):
            print(*os.listdir(), sep=args.sep)
We now optionally support execution similar to ``ls -A1``, via ``listdir -1``.
Fittingly, this is reflected in the script's autogenerated usage text – ``listdir -h`` prints::

    usage: listdir [-h] [--tb] [-1]

    print the current directory's contents

    optional arguments:
      -h, --help         show this help message and exit
      --tb, --traceback  print error tracebacks
      -1                 list one file per line
The command decorator
---------------------
For particularly trivial commands, the class declaration syntax may be considered verbose and unnecessary. The ``@cmd`` decorator manufactures the appropriate ``Command`` from a decorated function or method.
The first command may be rewritten to produce an identical result::

    from argcmdr import cmd

    @cmd
    def main():
        """print the current directory's contents"""
        print(*os.listdir())
and, for the second, ``cmd`` optionally accepts an ``argparse`` argument definition::

    @cmd('-1', action='store_const', const='\n', default=' ', dest='sep', help='list one file per line')
    def main(args):
        """print the current directory's contents"""
        print(*os.listdir(), sep=args.sep)
Further arguments may be added via additional decoration::

    @cmd('-a', ...)
    @cmd('-1', ...)
    def main(args):
        ...
Local execution
---------------
As much as we gain from Python and its standard library, it's quite typical to need to spawn non-Python subprocesses – and, for that matter, for a script's entire purpose to be the orchestration of workflows built from operating system commands. Python's – and argcmdr's – benefit is to make this work easier to write, debug, test and scale.
In fact, our above, trivial example could be accomplished easily with direct execution of ``ls``::

    import argparse

    from argcmdr import Local, main

    class Main(Local):
        """list directory contents"""

        def __init__(self, parser):
            parser.add_argument(
                'remainder',
                metavar='arguments for ls',
                nargs=argparse.REMAINDER,
            )

        def __call__(self, args):
            print(self.local['ls'](args.remainder))
``local``, bound to the ``Local`` base class, is a dictionary which caches path look-ups for system executables.
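As a minimal sketch, for orientation only (assuming the ``plumbum`` semantics illustrated throughout this tutorial): subscripting ``local`` performs the (cached) path look-up, further subscription binds arguments into a command expression, and calling that expression executes it::

    ls = self.local['ls']      # path look-up (cached)
    listing = ls['-la']        # bind arguments -- a command expression, not yet run
    output = listing()         # execute and capture standard output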
The ``__call__``-based approach above could, however, still be cleaner. For this reason, the ``Local`` command features a parallel invocation interface, ``prepare([args, parser, local, ...])``::

    class Main(Local):
        """list directory contents"""

        def __init__(self, parser):
            parser.add_argument(
                'remainder',
                metavar='arguments for ls',
                nargs=argparse.REMAINDER,
            )

        def prepare(self, args):
            return self.local['ls'][args.remainder]
Via the ``prepare`` interface, standard output is printed by default, and your command logic may be tested in a "dry run," as reflected in the usage output of the above::

    usage: listdir [-h] [--tb] [-q] [-d] [-s] [--no-show] ...

    list directory contents

    positional arguments:
      arguments for ls

    optional arguments:
      -h, --help         show this help message and exit
      --tb, --traceback  print error tracebacks
      -q, --quiet        do not print command output
      -d, --dry-run      do not execute commands, but print what they are (unless
                         --no-show is provided)
      -s, --show         print command expressions (by default not printed unless
                         dry-run)
      --no-show          do not print command expressions (by default not printed
                         unless dry-run)
To execute multiple local subprocesses, ``prepare`` may either return an iterable (*e.g.* ``list``) of the above ``plumbum`` bound commands, or ``prepare`` may be defined as a generator function, (*i.e.* make repeated use of ``yield`` – see below).
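For instance, a minimal sketch of the iterable-returning style (the ``git`` invocations are merely illustrative)::

    def prepare(self, args):
        # return a list of bound commands rather than yielding them one by one
        return [
            self.local['git']['fetch', '--all'],
            self.local['git']['status'],
        ]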
Managing execution
~~~~~~~~~~~~~~~~~~
Handling exit codes
+++++++++++++++++++
Subprocess commands emitted by ``Local.prepare`` are executed in order and, by default, failed execution is interrupted by a raised exception::

    class Release(Local):
        """release the package to pypi"""

        def __init__(self, parser):
            parser.add_argument(
                'part',
                choices=('major', 'minor', 'patch'),
                help="part of the version to be bumped",
            )

        def prepare(self, args):
            yield self.local['bumpversion'][args.part]
            yield self.local['python']['setup.py', 'sdist', 'bdist_wheel']
            yield self.local['twine']['upload', 'dist/*']
Should the ``bumpversion`` command fail, the ``release`` command will not proceed.
In some cases, however, we might like to disable this functionality, and proceed regardless of a subprocess's exit code. We may pass arguments such as ``retcode`` to ``plumbum`` by setting this attribute on the ``prepare`` method::

    def prepare(self, args):
        yield self.local['bumpversion'][args.part]
        yield self.local['python']['setup.py', 'sdist', 'bdist_wheel']
        yield self.local['twine']['upload', 'dist/*']

    prepare.retcode = None
Subprocess commands emitted by the above method will not raise execution exceptions, regardless of their exit code. (To allow only certain exit code(s), set ``retcode`` as appropriate – see plumbum_.)
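For instance – a sketch only, assuming ``plumbum``'s acceptance of a collection of allowed exit codes – ``retcode`` may instead be set to permit particular codes::

    def prepare(self, args):
        # grep exits 1 when it finds no match -- treat that as acceptable here
        yield self.local['grep']['TODO', 'README.rst']

    # allow exit codes 0 and 1; any other code still raises
    prepare.retcode = (0, 1)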
Inspecting results
++++++++++++++++++
Having disabled execution exceptions – and regardless – we might need to inspect a subprocess command's exit code, standard output or standard error. As such – whether we manipulate ``retcode`` or not – ``argcmdr`` communicates these command results back to ``prepare`` generator methods::

    def prepare(self, args):
        (code, out, err) = yield self.local['bumpversion']['--list', args.part]

        yield self.local['python']['setup.py', 'sdist', 'bdist_wheel']

        if out is None:
            version = 'DRY-RUN'
        else:
            (version_match,) = re.finditer(
                r'^new_version=([\d.]+)$',
                out,
                re.M,
            )
            version = version_match.group(1)

        yield self.local['twine']['upload', f'dist/*{version}*']
In the above, ``prepare`` stores the results of ``bumpversion`` execution, in order to determine from its standard output the version to be released.
Handling exceptions
+++++++++++++++++++
Moreover, we might like to define special handling for execution errors; and, perhaps rather than manipulate ``retcode`` for all commands emitted by our method, we might like to handle them separately. As such, execution exceptions are also communicated back to ``prepare`` generators::

    def prepare(self, args):
        try:
            (_code, out, _err) = yield self.local['bumpversion']['--list', args.part]
        except self.local.ProcessExecutionError:
            print("execution failed but here's a joke ...")
            ...
Modifying execution
+++++++++++++++++++
Commands are run in the foreground by default, via the ``plumbum`` modifier ``TEE``: their output is printed as well as recorded for inspection.
To execute a command in the background (and continue), we may specify the ``BG`` modifier::

    def prepare(self, args):
        future = yield (self.local.BG, self.local['bumpversion']['--list', args.part])
Alternatively, we may wish to execute a command in the foreground *only*, (and not record its output) – *e.g.* to best support processes which require TTY::

    def prepare(self):
        return (self.local.FG, self.local['ipython']['-i', 'startup.py'])
The local decorator
~~~~~~~~~~~~~~~~~~~
``Local`` is an alternate command base class, and a subclass of ``Command``. Any base class may be substituted for ``Command`` when using the command decorator::

    @cmd(base=CustomCommand)
    def main():
        ...
Moreover, ``Local`` functionality may be requested via keyword flag ``local``::

    @cmd(local=True)
    def main(self):
        ...
And, in support of this common case, the ``@local`` decorator is provided::

    from argcmdr import local

    @local
    def main(self):
        ...
Note that in the last two examples, our command function's call signature included ``self``.
Decorated command functions are in fact replaced with manufactured subclasses of ``Command``, and the function is invoked as this command's functionality – either ``__call__`` or ``prepare``. It is assumed that, by default, this function should be treated as a ``staticmethod``, and given no reference to the manufactured ``Command`` instance. However, in the case of ``local`` decoration, this is not the case; the binding is left up to the decorated object, which, according to Python descriptor rules, means that a decorated function is treated as a "method" and receives the instance. This way, ``local`` command functions may access the instance's ``local`` dictionary of operating system executables.
Binding may be explicitly controlled via the decorator keyword ``binding``, *e.g.*::

    @cmd(binding=True, base=CustomCommand)
    def main(self):
        ...
See `Method commands`_ for further examples of decorator-defined commands and alternative bindings.
Command invocation signature
----------------------------
Note that in our last trivial examples of listing directory contents, we made our script dependent upon the ``ls`` command in the operating environment. ``argcmdr`` will not, by default, print tracebacks, and it will colorize unhandled exceptions; however, we might prefer to print a far friendlier error message.
One easy way of printing friendly error messages is to make use of ``argparse.ArgumentParser.error()``. As we've seen, ``Command`` invocation, via either ``__call__`` or ``prepare``, may accept zero arguments, or it may require the parsed arguments ``argparse.Namespace``. Moreover, it may require a second argument to receive the argument parser, and a third argument to receive the ``local`` dictionary::

    class Main(Local):
        """list directory contents"""

        def __init__(self, parser):
            parser.add_argument(
                'remainder',
                metavar='arguments for ls',
                nargs=argparse.REMAINDER,
            )

        def prepare(self, args, parser, local):
            try:
                local_exec = local['ls']
            except local.CommandNotFound:
                parser.error('command not available')

            yield local_exec[args.remainder]
If ``ls`` is not available, the user is presented the following message upon executing the above::

    usage: listdir [-h] [--tb] [-q] [-d] [-s] [--no-show] ...
    listdir: error: command not available
Access to the parsed argument namespace
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The command invocation's parsed arguments are most straightforwardly accessible as the first argument of the ``Command`` invocation signature, either ``__call__`` or ``prepare``. However, in less-than-trivial implementations, wherein command methods are factored for reusability, passing the argument namespace from method to method may become tedious. To support such scenarios, this object is made additionally available via the ``Command`` *property*, ``args``.
Consider a class of commands which require a database password. We don't want to store this password anywhere in plain text; rather, we expect it to be input, either via (piped) standard input or the TTY::

    class DbSync(Command):
        """sync databases"""

        def __init__(self, parser):
            parser.add_argument(
                '-p', '--password',
                action='store_true',
                dest='stdin_password',
                default=False,
                help="read database password from standard input",
            )

        def __call__(self, args):
            engine = self.dbengine(args)
            ...

        def dbcreds(self, args):
            dbcreds = {
                'username': os.getenv('PGUSER'),
                'host': os.getenv('PGHOST'),
                'port': os.getenv('PGPORT'),
                'database': os.getenv('PGDATABASE'),
            }

            missing = [key for (key, value) in dbcreds.items() if not value]
            if missing:
                raise RuntimeError(
                    "database connection information missing from "
                    "environmental configuration: " + ', '.join(missing)
                )

            if args.stdin_password:
                dbcreds['password'] = sys.stdin.read().rstrip('\n\r')

                # we're done with the (pipe) stdin, so force it back to TTY for
                # any subsequent input()
                sys.stdin = open('/dev/tty')
            else:
                dbcreds['password'] = os.getenv('PGPASSWORD')
                if not dbcreds['password']:
                    dbcreds['password'] = getpass.getpass(
                        'enter password for '
                        + ('{username}@{host}:{port}'.format_map(dbcreds) | colors.bold)
                        + ': '
                        | colors.yellow
                    )

            return dbcreds

        def dburi(self, args):
            return sqlalchemy.engine.url.URL('postgres', **self.dbcreds(args))

        def dbengine(self, args):
            return sqlalchemy.create_engine(self.dburi(args))
Not only were we forced to verbosely daisy-chain the argument namespace ``args`` from method to method; we were also prevented from (trivially) caching the result of ``dbcreds``, to ensure that the password is never requested more than once.
Now, let's reimplement the above, making use of the property ``args``::

    class DbSync(Command):
        """sync databases"""

        def __init__(self, parser):
            parser.add_argument(
                '-p', '--password',
                action='store_true',
                dest='stdin_password',
                default=False,
                help="read database password from standard input",
            )

        def __call__(self):
            engine = self.dbengine
            ...

        @cachedproperty
        def dbcreds(self):
            dbcreds = {
                'username': os.getenv('PGUSER'),
                'host': os.getenv('PGHOST'),
                'port': os.getenv('PGPORT'),
                'database': os.getenv('PGDATABASE'),
            }

            missing = [key for (key, value) in dbcreds.items() if not value]
            if missing:
                raise RuntimeError(
                    "database connection information missing from "
                    "environmental configuration: " + ', '.join(missing)
                )

            if self.args.stdin_password:
                dbcreds['password'] = sys.stdin.read().rstrip('\n\r')

                # we're done with the (pipe) stdin, so force it back to TTY for
                # any subsequent input()
                sys.stdin = open('/dev/tty')
            else:
                dbcreds['password'] = os.getenv('PGPASSWORD')
                if not dbcreds['password']:
                    dbcreds['password'] = getpass.getpass(
                        'enter password for '
                        + ('{username}@{host}:{port}'.format_map(dbcreds) | colors.bold)
                        + ': '
                        | colors.yellow
                    )

            return dbcreds

        @property
        def dburi(self):
            return sqlalchemy.engine.url.URL('postgres', **self.dbcreds)

        @property
        def dbengine(self):
            return sqlalchemy.create_engine(self.dburi)
In this form, ``args`` needn't be passed from method to method; in fact, methods of the ``DbSync`` command needn't worry about arguments which don't directly interest them at all. And, using ``cachedproperty`` from Dickens_, the database credentials are trivially cached, ensuring they aren't needlessly re-requested.
Note that attempting to access the ``args`` property before invocation arguments have been parsed – *e.g.* within ``__init__`` – is not allowed, and will raise ``RuntimeError``.
Access to the parser
~~~~~~~~~~~~~~~~~~~~
In addition to ``args``, the ``parser`` associated with the command may alternatively be retrieved via its ``parser`` property.
Similar to ``args``, the ``parser`` is not available until the command has been initialized; however, this property *may* be used *within* ``__init__``, so long as the base ``__init__`` has been invoked (*e.g.* via ``super().__init__``).
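For illustration, a minimal sketch (assuming that the base ``__init__`` accepts the parser as its argument)::

    class Main(Command):
        """demonstrate access to the parser property"""

        def __init__(self, parser):
            super().__init__(parser)  # assumed signature: invoke the base __init__ first

            parser.add_argument('--verbose', action='store_true')

            # the parser property is now available, e.g. for diagnostics
            assert self.parser is not None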
Command hierarchy
-----------------
Our tools should be modular and composable, favoring atomicity over monolithism. Nevertheless, well-designed, -structured and -annotated code and application interfaces pay their users and developers tremendous dividends over time – no less in the case of more extensive interfaces, and particularly so for project management libraries (consider the ``Makefile``).
``argcmdr`` intends to facilitate the definition of ``argparse``-based interfaces regardless of their structure. But it's in multi-level, or hierarchical, command argumentation that ``argcmdr`` shines.
Nested commands
~~~~~~~~~~~~~~~
Rather than procedurally defining subparsers, ``Command`` class declarations may simply be nested.
Let's define an executable file ``manage`` for managing a codebase::

    #!/usr/bin/env python

    import os

    from argcmdr import Local, main

    class Management(Local):
        """manage deployment"""

        def __init__(self, parser):
            parser.add_argument(
                '-e', '--env',
                choices=('development', 'production'),
                default='development',
                help="target environment",
            )

        class Build(Local):
            """build app"""

            def prepare(self, args):
                req_path = os.path.join('requirements', f'{args.env}.txt')
                yield self.local['pip']['-r', req_path]

        class Deploy(Local):
            """deploy app"""

            def prepare(self, args):
                yield self.local['eb']['deploy', args.env]

    if __name__ == '__main__':
        main(Management)
``Local`` command ``Management``, above, defines no functionality of its own. As such, executing ``manage`` without arguments prints its autogenerated usage::

    usage: manage [-h] [--tb] [-q] [-d] [-s] [--no-show]
                  [-e {development,production}]
                  {build,deploy} ...
Because ``Management`` extends ``Local``, it inherits argumentation controlling whether standard output is printed and offering to run commands in "dry" mode. (Note, however, that it could have omitted these options by extending ``Command``. Moreover, it may override class method ``base_parser()``.)
``Management`` adds the optional argument ``--env`` to the basic interface. Most important, however, are the related, nested commands ``Build`` and ``Deploy``, which define functionality via ``prepare``. Neither nested command extends its subparser – though they could; rather, they depend upon the common argumentation defined by ``Management``.
Exploring the interface via ``--help`` tells us a great deal, for example ``manage -h``::

    usage: manage [-h] [--tb] [-q] [-d] [-s] [--no-show]
                  [-e {development,production}]
                  {build,deploy} ...

    manage deployment

    optional arguments:
      -h, --help            show this help message and exit
      --tb, --traceback     print error tracebacks
      -q, --quiet           do not print command output
      -d, --dry-run         do not execute commands, but print what they are
                            (unless --no-show is provided)
      -s, --show            print command expressions (by default not printed
                            unless dry-run)
      --no-show             do not print command expressions (by default not
                            printed unless dry-run)
      -e {development,production}, --env {development,production}
                            target environment

    management commands:
      {build,deploy}        available commands
        build               build app
        deploy              deploy app
And ``manage deploy -h``::

    usage: manage deploy [-h]

    deploy app

    optional arguments:
      -h, --help  show this help message and exit
As such, a "dry run"::
manage -de production deploy
prints the following::
> /home/user/.local/bin/eb deploy production
and without the dry-run flag the above operating system command is executed.
Root commands
~~~~~~~~~~~~~
There is no artificial limit to the number of levels you may add to your command hierarchy. However, application interfaces are commonly "wider" than they are "deep". For this reason, as an alternative to class-nesting, the hierarchical relationship may be defined by a class decorator provided by the ``RootCommand``.
Let's define the executable file ``git`` with no particular purpose whatsoever::

    #!/usr/bin/env python

    from argcmdr import Command, RootCommand, main

    class Git(RootCommand):
        """another stupid content tracker"""

        def __init__(self, parser):
            parser.add_argument(
                '-C',
                default='.',
                dest='path',
                help="run as if git was started in <path> instead of the current "
                     "working directory.",
            )

    @Git.register
    class Stash(Command):
        """stash the changes in a dirty working directory away"""

        def __call__(self, args):
            self['save'].delegate()

        class Save(Command):
            """save your local modifications to a new stash"""

            def __init__(self, parser):
                parser.add_argument(
                    '-p', '--patch',
                    dest='interactive',
                    action='store_true',
                    default=False,
                    help="interactively select hunks from the diff between HEAD "
                         "and the working tree to be stashed",
                )

            def __call__(self, args):
                print("stash save", f"(interactive: {args.interactive})")

        class List(Command):
            """list the stashes that you currently have"""

            def __call__(self):
                print("stash list")

    if __name__ == '__main__':
        main(Git)
We anticipate adding many subcommands to ``git`` beyond ``stash``; and so, rather than nest all of these command classes under ``Git``:
* we've defined ``Git`` as a ``RootCommand``
* we've declared ``Stash`` at the module root
* we've decorated ``Stash`` with ``Git.register``
The ``RootCommand`` functions identically to the ``Command``; it only adds the ability to extend the listing of its subcommands with those registered via its decorator. (Notably, ``LocalRoot`` composes the functionality of ``Local`` and ``RootCommand`` via multiple inheritance.)
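For instance, a minimal sketch combining the two via ``LocalRoot`` (the ``Status`` subcommand and its ``git`` invocation are merely illustrative)::

    from argcmdr import Local, LocalRoot

    class Main(LocalRoot):
        """project management commands"""

    @Main.register
    class Status(Local):
        """report the repository's status"""

        def prepare(self):
            yield self.local['git']['status']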
The ``stash`` command, on the other hand, has opted to contain the entirety of its hierarchical functionality, nesting its own subcommands ``list`` and ``save``.
Nevertheless, you are not limited to a single ``RootCommand``. Any command whose hierarchy you would like to extend via the ``register`` decorator may inherit it. Moreover, the ``@cmd`` decorator accepts the keyword flag ``root``.
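A sketch only – assuming that the ``root`` flag manufactures a command which, like ``RootCommand``, exposes ``register``::

    from argcmdr import cmd, Command

    @cmd(root=True)
    def main(args):
        """top-level command"""
        print("hello from the root")

    @main.register
    class Status(Command):
        """report status"""

        def __call__(self):
            print("status: ok")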
Method commands
~~~~~~~~~~~~~~~
Decorator-manufactured commands are no less capable than those derived from class declaration syntax, *except* in that other commands cannot, syntactically, be nested beneath them. (For that reason the ``@cmd`` decorator's ``root`` flag is of note.) Decorator-manufactured commands can nonetheless themselves extend hierarchies, either by being further decorated by ``register`` or nested under command class declarations::

    @Git.register
    class Stash(Command):
        """stash the changes in a dirty working directory away"""

        def __call__(self, args):
            self['save'].delegate()

        @cmd('-p', '--patch', dest='interactive', action='store_true', default=False,
             help="interactively select hunks from the diff between HEAD "
                  "and the working tree to be stashed")
        def save(args):
            """save your local modifications to a new stash"""
            print("stash save", f"(interactive: {args.interactive})")

        @cmd
        def list():
            """list the stashes that you currently have"""
            print("stash list")
Above we've rewritten the trivial ``stash`` commands ``save`` and ``list`` as ``@cmd``-decorated functions.
Say, however, that we needed to invert the factoring of ``save`` logic between that command and its parent::

    @Git.register
    class Stash(Command):
        """stash the changes in a dirty working directory away"""

        def perform_save(self, interactive=False):
            print("stash save", f"(interactive: {interactive})")

        def __call__(self):
            self.perform_save()

        @cmd('-p', '--patch', dest='interactive', action='store_true', default=False,
             help="interactively select hunks from the diff between HEAD "
                  "and the working tree to be stashed")
        @cmd(binding=True)
        def save(self, args):
            """save your local modifications to a new stash"""
            self[-1].perform_save(args.interactive)

        @cmd
        def list():
            """list the stashes that you currently have"""
            print("stash list")
(Note that ``cmd`` can accept both an ``argparse`` argument specification and command feature-defining arguments at once; however, this is of use mainly to the definition of helpers such as the ``local`` decorator, as this style is difficult to read and otherwise discouraged. Moreover, only the **first** – *i.e.* inner-most – ``cmd`` decorator's command features are respected.)
In this version, ``save`` functionality is shared as a method of ``Stash``. ``save`` is able to access this method only by ascending the command hierarchy. This might make particular sense when multiple nested commands must share functionality, which is defined on the command class under which they are nested. (Note, however, that in such a case as this one, where the shared method *could* be defined as a ``staticmethod``, it is no less advisable to do so, and for nested commands to access it directly as, *e.g.* ``Stash.perform_save``.)
Our above reference to ``self`` in ``save``, however, is at first glance misleading. This command *looks* like an instance method of ``Stash``; yet, it's its own ``Command``, and the ``save`` function receives as its first invocation argument an instance of the ``Command`` class ``save``. Moreover, in this case, ``save`` gains nothing from this self-reference; its class defines no special attributes or functionality of its own beyond argument-parsing.
To improve on the above, we may instead decorate our command function with ``cmdmethod``::

    @Git.register
    class Stash(Command):
        """stash the changes in a dirty working directory away"""

        def perform_save(self, interactive=False):
            print("stash save", f"(interactive: {interactive})")

        def __call__(self):
            self.perform_save()

        @cmdmethod('-p', '--patch', dest='interactive', action='store_true', default=False,
                   help="interactively select hunks from the diff between HEAD "
                        "and the working tree to be stashed")
        def save(self, args):
            """save your local modifications to a new stash"""
            self.perform_save(args.interactive)
The ``cmdmethod`` decorator – as well as the complementary ``localmethod`` decorator – alter the binding of the decorated function such that it receives the instance of its parent command – not itself – upon invocation. Much cleaner.
As with the ``local`` decorator, ``cmdmethod`` is merely a wrapper of ``cmd``. Identical functionality can be achieved via the ``binding`` keyword, though far more verbosely::

    from argcmdr import CommandDecorator

    @cmd(binding=CommandDecorator.Binding.parent)
    def save(self, args):
        ...
Walking the hierarchy
~~~~~~~~~~~~~~~~~~~~~
Unlike the base command ``git`` in the example above, the command ``git stash`` – despite defining its own subcommands – also defines its own functionality, via ``__call__``. This functionality, however, is merely a shortcut to the ``stash`` command ``save``. Rather than repeat the definition of this functionality, ``Stash`` "walks" its hierarchy to access the instantiation of ``Save``, and invokes this command by reference.
Much of ``argcmdr`` is defined at the class level, and as such many ``Command`` methods are ``classmethod``. In the static or class context, we might walk the command hierarchy by reference, *e.g.* to ``Stash.Save``; or, from a class method of ``Stash``, as ``cls.Save``. Moreover, ``Command`` defines the class-level "property" ``subcommands``, which returns a list of ``Command`` classes immediately "under" it in the hierarchy.
The hierarchy of executable command objects, however, is instantiated at runtime and cached within the ``Command`` instance. To facilitate navigation of this hierarchy, the ``Command`` object is itself subscriptable. Look-up keys may be:
* strings – descend the hierarchy to the named command
* negative integers – ascend the hierarchy this many levels
* a sequence combining the above – to combine "steps" into a single action
In the above example, ``Stash`` may have (redundantly) accessed ``Save`` with the look-up key::

    (-1, 'stash', 'save')

that is with the full expression::

    self[-1, 'stash', 'save']

(The single key ``'save'``, however, was far more to the point.)
Because command look-ups are relative to the current command, ``Command`` also offers the ``property`` ``root``, which returns the base command. As such, our redundant expression could be rewritten::

    self.root['stash', 'save']
Finally, a command instance's immediate subcommands may be iterated by iteration of the command, *e.g.*::

    def __call__(self):
        for subcommand in self:
            subcommand.delegate()
Command delegation
~~~~~~~~~~~~~~~~~~
As you've seen above, command instance subscription enables access to ancestor and descendent commands from the command hierarchy.
And, simple ``Command`` instances may be executed directly via ``__call__``. However, above, we instead invoked the ``delegate`` method. Why?
* ``__call__`` must be invoked as defined – including its argument signature – which may or may not include ``args`` and/or ``parser`` (and which may change during development)
* The ``args`` and ``parser`` in the scope of the delegating command – (generally the command actually selected by user argumentation) – reflect the arguments defined for that command, *not* those of the delegated command.
For ``Local`` command instances, the situation, without ``delegate``, is worse:
* To generate system commands (rather than executing them immediately), we must know to target ``prepare`` rather than ``__call__``
For example, above, our ``Stash`` command might look like the following without ``delegate``::

    class Stash(Command):
        """stash the changes in a dirty working directory away"""

        def __call__(self, args):
            self['save'](args)

        class Save(Command):
            """save your local modifications to a new stash"""

            def __init__(self, parser):
                parser.add_argument(
                    '-p', '--patch',
                    dest='interactive',
                    action='store_true',
                    default=False,
                    help="interactively select hunks from the diff between HEAD "
                         "and the working tree to be stashed",
                )

            def __call__(self, args):
                interactive = getattr(args, 'interactive', False)
                print("stash save", f"(interactive: {interactive})")
Note, in ``Stash.__call__``, the passing through of ``args``; and, in ``Stash.Save.__call__``, the use of ``getattr``. With ``delegate``, neither is required.
The call() method
+++++++++++++++++
You'll *also* find that there's the command method ``call`` (without underscores)!
This is a shortcut for ``delegate('__call__', …)``: *i.e.*, it will delegate to the bound command only by invoking its ``__call__`` method (even if it's a ``Local`` command defining ``prepare``).
Whereas ``delegate`` is useful for switching between commands via their default invocation methods (either ``__call__`` or ``prepare``), and for switching between execution methods of a single command, ``call`` is useful for ensuring that the bound command will be executed – *i.e.* that its ``__call__`` method will be invoked – regardless of its type. This is important to argcmdr itself (in ``argcmdr.main``), and useful for command delegation across disparate base classes.
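Contrasting the two, in a minimal sketch::

    def __call__(self):
        # delegate() invokes the subcommand's default method --
        # __call__ or, for a Local command, prepare
        self['save'].delegate()

        # call() invokes the subcommand's __call__ method,
        # even if it is a Local command defining prepare
        self['save'].call()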
The management file
-------------------
In addition to the interface of custom executables, ``argcmdr`` endeavors to improve the generation and maintainability of non-executable but standardized files, intended for management of code development projects and operations.
Similar to a project's ``Makefile``, we might define our previous codebase-management file as the following Python module, ``manage.py``::

    import os

    from argcmdr import Local, main

    class Management(Local):
        """manage deployment"""

        def __init__(self, parser):
            parser.add_argument(
                '-e', '--env',
                choices=('development', 'production'),
                default='development',
                help="target environment",
            )

        class Build(Local):
            """build app"""

            def prepare(self, args):
                req_path = os.path.join('requirements', f'{args.env}.txt')
                yield self.local['pip']['-r', req_path]

        class Deploy(Local):
            """deploy app"""

            def prepare(self, args):
                yield self.local['eb']['deploy', args.env]
Unlike our original script, ``manage``, ``manage.py`` is not executable, and need define neither an initial shebang line nor a final ``__name__ == '__main__'`` block.
Rather, ``argcmdr`` supplies its own, general-purpose ``manage`` executable command, which loads Commands from any ``manage.py`` in the current directory, or as specified by option ``--manage-file PATH``. As such, the usage and functionality of our ``manage.py``, as invoked via argcmdr's installed ``manage`` command, is identical to our original ``manage``. We need only ensure that ``argcmdr`` is installed, in order to make use of it to manage any or all project tasks, in a standard way, with even less boilerplate.
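For example (assuming the above ``manage.py`` resides in the working directory)::

    manage build
    manage -d -e production deploy
    manage --manage-file path/to/manage.py build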
Entrypoint definition
~~~~~~~~~~~~~~~~~~~~~
In lieu of an explicitly defined execution path, ``manage`` infers the base command – and hence the entrypoint – of the ``manage.py`` management file module.
The entrypoint of a management file defining – at the module level – only one ``Command``, or multiple commands but only one ``RootCommand``, is assumed to be this one command. Otherwise, the intended entrypoint must be decorated with ``@entrypoint``::

    from argcmdr import entrypoint, RootCommand

    class GoodCommand(RootCommand):

        def good_function(self):
            ...

    @entrypoint
    class CommandEhh(GoodCommand):

        def __call__(self):
            self.good_function()
            ...

    @CommandEhh.register
    class CommandBeh(GoodCommand):

        def __call__(self):
            self.good_function()
            ...
We may infer from the above that ``GoodCommand`` is merely a base class extension, and that the module's CLI begins with the most "root" command, ``CommandEhh``, which is extended by ``CommandBeh``. However, rather than go out on a limb, when presented with these three subclasses of ``Command`` and ``RootCommand``, ``argcmdr`` requires that the intended entrypoint is explicitly marked.
Note, however, that only commands declared at the module, or "top" level, are considered potential entrypoints::

    class CommandEhh(Command):

        class CommandBeh(Command):
            ...
Presented with a module containing only the above commands, ``argcmdr`` would identify ``CommandEhh`` as the entrypoint; ``CommandBeh`` would never be considered, even if decorated ``@entrypoint``.
The management package
~~~~~~~~~~~~~~~~~~~~~~
Python *packages*, no less than stand-alone modules, may also be defined for use with the ``manage`` command, to aid in maintenance and development.
Consider the following example directory layout::

    manage/
    ├── __init__.py
    ├── cloud.py
    ├── db.py
    ├── main.py
    ├── morale.py
    ├── server.py
    └── util.py
``argcmdr`` will load the above top-level Python module, ``manage``, just as well as it would the ``manage`` module defined by a ``manage.py`` file, (whether these are available on the ``PYTHONPATH`` or not).
Furthermore, detecting that *this* ``manage`` is in fact a package, ``argcmdr`` will *automatically* and *recursively* load all of the modules this package contains.
This allows the developer to provide ``argcmdr`` the minimum that it requires at ``manage/__init__.py`` – access to an interface entrypoint, *i.e.* the base ``Command`` – and to organize the development of that interface in whatever maintainable way suits them.
To wit, the developer simply might write, in ``manage/__init__.py``::

    from .main import Main  # noqa
(…And they will have no need of the ``@entrypoint`` decorator, as ``argcmdr`` will only see the one top-level command.)
Of course, that top-level command might have been defined in ``__init__.py``, or as you might prefer, in ``manage/main.py``::

    from argcmdr import RootCommand

    class Main(RootCommand):
        """your one-stop shop for devops"""

        ...
And, each subcommand may be defined in a submodule, such as ``manage/cloud.py``::

    from argcmdr import Command

    from .main import Main

    @Main.register
    class Cloud(Command):
        """manage cloud computing resources"""

        ...
Thanks to automatic loading, the ``Cloud`` subcommand, (which will resolve to ``manage cloud``), will be picked up, without additional boilerplate and without needing to consider circular imports.
To disable automatic submodule loading, set the following in ``manage/__init__.py``::

    __auto_init_package__ = False
And to make (custom) use of this feature, see: ``argcmdr.init_package()``.
Bootstrapping
~~~~~~~~~~~~~
To ensure that such a friendly – and *relatively* high-level – project requirement as ``argcmdr`` is satisfied, consider the expressly low-level utility install-cli_, with which to guide contributors through the process of provisioning your project's most basic requirements.
Shell completion
----------------
``argcmdr`` supports shell command argument completion via ``argcomplete`` (see argcomplete_).
As explained by its documentation, your user (perhaps while installing your command) may enable argument completion, either:
* specifically for your shell command
* or generally for any script containing the string **PYTHON_ARGCOMPLETE_OK** in its first 1024 bytes (as illustrated below)
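For the latter, it should suffice to include the marker near the top of the executable – for example, in the ``listdir`` script defined earlier::

    #!/usr/bin/env python
    # PYTHON_ARGCOMPLETE_OK

    import os

    from argcmdr import Command, main

    ...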
For flexibility, (and, *e.g.*, in support of installation into virtual environments, or otherwise where system- or user-global installation is undesirable or impossible), ``argcmdr`` *does not* currently insist on a particular scheme to enable argument completion.
Rather, for example, to enable argument completion system-wide, specifically for the ``manage`` command (provisioned by ``argcmdr``), you might execute the following from a Bash shell (as the root user)::

    register-python-argcomplete --shell bash manage > /usr/share/bash-completion/completions/python-argcomplete-manage.sh
(or depending upon your system)::

    register-python-argcomplete --shell bash manage > /etc/bash_completion.d/python-argcomplete-manage.sh
Alternatively, the same argument completion may be enabled, but only for the current user::

    mkdir -p ~/.local/share/bash-completion/completions/
    register-python-argcomplete --shell bash manage > ~/.local/share/bash-completion/completions/python-argcomplete-manage.sh
(or as preferred)::

    mkdir -p ~/.bash_completion.d
    register-python-argcomplete --shell bash manage > ~/.bash_completion.d/python-argcomplete-manage.sh
In the latter case only, the user may also require the file ``~/.bash_completion``, with contents of the following form::

    if [ -d ~/.bash_completion.d/ ] && [ ! -z "$(ls ~/.bash_completion.d/)" ]; then
      for bcfile in ~/.bash_completion.d/*; do
        . "$bcfile"
      done
    fi
(such that Bash will load the completion file automatically).
In the case that neither system-wide nor user-only installation is appropriate, the same argument completion may be enabled, but only for the current shell::
eval "$(register-python-argcomplete --shell bash manage)"
Regardless of the method, having so enabled argument completion (for your command), in your shell, ``argcmdr`` will handle the rest, generating completion suggestions based on your command definition.
.. _argparse: https://docs.python.org/3/library/argparse.html
.. _python.org: https://www.python.org/downloads/
.. _Homebrew: https://brew.sh/
.. _pyenv: https://github.com/pyenv/pyenv
.. _pyenv installer: https://github.com/pyenv/pyenv-installer#installation--update--uninstallation
.. _plumbum: https://plumbum.readthedocs.io/en/latest/local_commands.html#exit-codes
.. _Dickens: https://github.com/dssg/dickens
.. _install-cli: https://github.com/dssg/install-cli
.. _argcomplete: https://argcomplete.readthedocs.io/
Raw data
{
"_id": null,
"home_page": "https://github.com/dssg/argcmdr",
"name": "argcmdr",
"maintainer": "",
"docs_url": null,
"requires_python": ">=3.6",
"maintainer_email": "",
"keywords": "",
"author": "Center for Data Science and Public Policy",
"author_email": "datascifellows@gmail.com",
"download_url": "https://files.pythonhosted.org/packages/a0/0d/8519feb791101b75498a31f6010a678c736cf6e39a09e5bd99558899c167/argcmdr-1.1.0.tar.gz",
"platform": null,
"description": "=======\nargcmdr\n=======\n\nThe thin ``argparse`` wrapper for quick, clear and easy declaration of (hierarchical) console command interfaces via Python.\n\n``argcmdr``:\n\n* handles the boilerplate of CLI\n\n * while maintaining the clarity and extensibility of your code\n * without requiring you to learn Yet Another argument-definition syntax\n * without reinventing the wheel or sacrificing the flexibility of ``argparse``\n\n* enables invocation via\n\n * executable script (``__name__ == '__main__'``)\n * ``setuptools`` entrypoint\n * command-defining module (like the ``Makefile`` of ``make``)\n\n* determines command hierarchy flexibly and cleanly\n\n * command declarations are nested to indicate CLI hierarchy *or*\n * commands are decorated to indicate their hierarchy\n\n* includes support for elegant interaction with the operating system, via ``plumbum``\n\nSetup\n=====\n\n``argcmdr`` is developed for Python version 3.6.3 and above, and may be built via ``setuptools``.\n\nPython\n------\n\nIf Python 3.6.3 or greater is not installed on your system, it is available from python.org_.\n\nHowever, depending on your system, you might prefer to install Python via a package manager, such as Homebrew_ on Mac OS X or APT on Debian-based Linux systems.\n\nAlternatively, pyenv_ is highly recommended to manage arbitrary installations of Python, and may be most easily installed via the `pyenv installer`_.\n\nargcmdr\n-------\n\nTo install from PyPI::\n\n pip install argcmdr\n\nTo install from Github::\n\n pip install git+https://github.com/dssg/argcmdr.git\n\nTo install from source::\n\n python setup.py install\n\nTutorial\n========\n\n.. contents::\n :local:\n\nThe Command\n-----------\n\n``argcmdr`` is built around the base class ``Command``. 
Your console command extends ``Command``, and optionally defines:\n\n* ``__init__(parser)``, which adds to the parser the arguments that your command requires, as supported by ``argparse`` (see argparse_)\n* ``__call__([args, parser, ...])``, which is invoked when your console command is invoked, and which is expected to implement your command's functionality\n\nFor example, let's define the executable file ``listdir``, a trivial script which prints the current directory's contents::\n\n #!/usr/bin/env python\n\n import os\n\n from argcmdr import Command, main\n\n class Main(Command):\n \"\"\"print the current directory's contents\"\"\"\n\n def __call__(self):\n print(*os.listdir())\n\n if __name__ == '__main__':\n main(Main)\n\nShould we execute this script, it will perform much the same as ``ls -A``.\n\nLet's say, however, that we would like to optionally print each item of the directory's contents on a separate line::\n\n class Main(Command):\n \"\"\"print the current directory's contents\"\"\"\n\n def __init__(self, parser):\n parser.add_argument(\n '-1',\n action='store_const',\n const='\\n',\n default=' ',\n dest='sep',\n help='list one file per line',\n )\n\n def __call__(self, args):\n print(*os.listdir(), sep=args.sep)\n\nWe now optionally support execution similar to ``ls -A1``, via ``listdir -1``.\n\nFittingly, this is reflected in the script's autogenerated usage text \u2013 ``listdir -h`` prints::\n\n usage: listdir [-h] [--tb] [-1]\n\n print the current directory's contents\n\n optional arguments:\n -h, --help show this help message and exit\n --tb, --traceback print error tracebacks\n -1 list one file per line\n\nThe command decorator\n---------------------\n\nFor particularly trivial commands, the class declaration syntax may be considered verbose and unnecessary. The ``@cmd`` decorator manufactures the appropriate ``Command`` from a decorated function or method.\n\nThe first command may be rewritten to produce an identical result::\n\n from argcmdr import cmd\n\n @cmd\n def main():\n \"\"\"print the current directory's contents\"\"\"\n print(*os.listdir())\n\nand, for the second, ``cmd`` optionally accepts an ``argparse`` argument definition::\n\n @cmd('-1', action='store_const', const='\\n', default=' ', dest='sep', help='list one file per line')\n def main(args):\n \"\"\"print the current directory's contents\"\"\"\n print(*os.listdir(), sep=args.sep)\n\nFurther arguments may be added via additional decoration::\n\n @cmd('-a', ...)\n @cmd('-1', ...)\n def main(args):\n ...\n\nLocal execution\n---------------\n\nAs much as we gain from Python and its standard library, it's quite typical to need to spawn non-Python subprocesses, and for that matter for your script's purpose to be entirely to orchestrate workflows built from operating system commands. Python's \u2013 and argcmdr's \u2013 benefit is to make this work easier, debuggable, testable and scalable.\n\nIn fact, our above, trivial example could be accomplished easily with direct execution of ``ls``::\n\n import argparse\n\n from argcmdr import Local, main\n\n class Main(Local):\n \"\"\"list directory contents\"\"\"\n\n def __init__(self, parser):\n parser.add_argument(\n 'remainder',\n metavar='arguments for ls',\n nargs=argparse.REMAINDER,\n )\n\n def __call__(self, args):\n print(self.local['ls'](args.remainder))\n\n``local``, bound to the ``Local`` base class, is a dictionary which caches path look-ups for system executables.\n\nThis could, however, still be cleaner. 
For this reason, the ``Local`` command features a parallel invocation interface, ``prepare([args, parser, local, ...])``::\n\n class Main(Local):\n \"\"\"list directory contents\"\"\"\n\n def __init__(self, parser):\n parser.add_argument(\n 'remainder',\n metavar='arguments for ls',\n nargs=argparse.REMAINDER,\n )\n\n def prepare(self, args):\n return self.local['ls'][args.remainder]\n\nVia the ``prepare`` interface, standard output is printed by default, and your command logic may be tested in a \"dry run,\" as reflected in the usage output of the above::\n\n usage: listdir [-h] [--tb] [-q] [-d] [-s] [--no-show] ...\n\n list directory contents\n\n positional arguments:\n arguments for ls\n\n optional arguments:\n -h, --help show this help message and exit\n --tb, --traceback print error tracebacks\n -q, --quiet do not print command output\n -d, --dry-run do not execute commands, but print what they are (unless\n --no-show is provided)\n -s, --show print command expressions (by default not printed unless\n dry-run)\n --no-show do not print command expressions (by default not printed\n unless dry-run)\n\nTo execute multiple local subprocesses, ``prepare`` may either return an iterable (*e.g.* ``list``) of the above ``plumbum`` bound commands, or ``prepare`` may be defined as a generator function, (*i.e.* make repeated use of ``yield`` \u2013 see below).\n\nManaging execution\n~~~~~~~~~~~~~~~~~~\n\nHandling exit codes\n+++++++++++++++++++\n\nSubprocess commands emitted by ``Local.prepare`` are executed in order and, by default, failed execution is interrupted by a raised exception::\n\n class Release(Local):\n \"\"\"release the package to pypi\"\"\"\n\n def __init__(self, parser):\n parser.add_argument(\n 'part',\n choices=('major', 'minor', 'patch'),\n help=\"part of the version to be bumped\",\n )\n\n def prepare(self, args):\n yield self.local['bumpversion'][args.part]\n yield self.local['python']['setup.py', 'sdist', 'bdist_wheel']\n yield self.local['twine']['upload', 'dist/*']\n\nShould the ``bumpversion`` command fail, the ``deploy`` command will not proceed.\n\nIn some cases, however, we might like to disable this functionality, and proceed regardless of a subprocess's exit code. We may pass arguments such as ``retcode`` to ``plumbum`` by setting this attribute on the ``prepare`` method::\n\n def prepare(self, args):\n yield self.local['bumpversion'][args.part]\n yield self.local['python']['setup.py', 'sdist', 'bdist_wheel']\n yield self.local['twine']['upload', 'dist/*']\n\n prepare.retcode = None\n\nSubprocess commands emitted by the above method will not raise execution exceptions, regardless of their exit code. (To allow only certain exit code(s), set ``retcode`` as appropriate \u2013 see plumbum_.)\n\nInspecting results\n++++++++++++++++++\n\nHaving disabled execution exceptions \u2013 and regardless \u2013 we might need to inspect a subprocess command's exit code, standard output or standard error. 
As such, (whether we manipulate ``retcode`` or not), ``argcmdr`` communicates these command results with ``prepare`` generator methods::\n\n def prepare(self, args):\n (code, out, err) = yield self.local['bumpversion']['--list', args.part]\n\n yield self.local['python']['setup.py', 'sdist', 'bdist_wheel']\n\n if out is None:\n version = 'DRY-RUN'\n else:\n (version_match,) = re.finditer(\n r'^new_version=([\\d.]+)$',\n out,\n re.M,\n )\n version = version_match.group(1)\n\n yield self.local['twine']['upload', f'dist/*{version}*']\n\nIn the above, ``prepare`` stores the results of ``bumpversion`` execution, in order to determine from its standard output the version to be released.\n\nHandling exceptions\n+++++++++++++++++++\n\nMoreover, we might like to define special handling for execution errors; and, perhaps rather than manipulate ``retcode`` for all commands emitted by our method, we might like to handle them separately. As such, execution exceptions are also communicated back to ``prepare`` generators::\n\n def prepare(self, args):\n try:\n (_code, out, _err) = yield self.local['bumpversion']['--list', args.part]\n except self.local.ProcessExecutionError:\n print(\"execution failed but here's a joke ...\")\n ...\n\nModifying execution\n+++++++++++++++++++\n\nCommands are run in the foreground by default, their outputs printed, as well as recorded for inspection, via the ``plumbum`` modifier, ``TEE``.\n\nTo execute a command in the background (and continue), we may specify the ``BG`` modifier::\n\n def prepare(self, args):\n future = yield (self.local.BG, self.local['bumpversion']['--list', args.part])\n\nAlternatively, we may wish to execute a command in the foreground *only*, (and not record its output) \u2013 *e.g.* to best support processes which require TTY::\n\n def prepare(self):\n return (self.local.FG, self.local['ipython']['-i', 'startup.py'])\n\nThe local decorator\n~~~~~~~~~~~~~~~~~~~\n\n``Local`` is an alternate command base class, and a subclass of ``Command``. Any base class may be substituted for ``Command`` when using the command decorator::\n\n @cmd(base=CustomCommand)\n def main():\n ...\n\nMoreover, ``Local`` functionality may be requested via keyword flag ``local``::\n\n @cmd(local=True)\n def main(self):\n ...\n\nAnd in support of the above, common case, the ``@local`` decorator is provided::\n\n from argcmdr import local\n\n @local\n def main(self):\n ...\n\nNote that in the last two examples, our command function's call signature included ``self``.\n\nDecorated command functions are in fact replaced with manufactured subclasses of ``Command``, and the function is invoked as this command's functionality \u2013 either ``__call__`` or ``prepare``. It is assumed that, by default, this function should be treated as a ``staticmethod``, and given no reference to the manufactured ``Command`` instance. However, in the case of ``local`` decoration, this is not the case; the binding is left up to the decorated object, which, according to Python descriptor rules, means that a decorated function is treated as a \"method\" and receives the instance. 
This way, ``local`` command functions may access the instance's ``local`` dictionary of operating system executables.\n\nBinding may be explicitly controlled via the decorator keyword ``binding``, *e.g.*::\n\n @cmd(binding=True, base=CustomCommand)\n def main(self):\n ...\n\nSee `Method commands`_ for further examples of decorator-defined commands and alternative bindings.\n\nCommand invocation signature\n----------------------------\n\nNote that in our last trivial examples of listing directory contents, we made our script dependent upon the ``ls`` command in the operating environment. ``argcmdr`` will not, by default, print tracebacks, and it will colorize unhandled exceptions; however, we might prefer to print a far friendlier error message.\n\nOne easy way of printing friendly error messages is to make use of ``argparse.ArgumentParser.error()``. As we've seen, ``Command`` invocation, via either ``__call__`` or ``prepare``, may accept zero arguments, or it may require the parsed arguments ``argparse.Namespace``. Moreover, it may require a second argument to receive the argument parser, and a third argument to receive the ``local`` dictionary::\n\n class Main(Local):\n \"\"\"list directory contents\"\"\"\n\n def __init__(self, parser):\n parser.add_argument(\n 'remainder',\n metavar='arguments for ls',\n nargs=argparse.REMAINDER,\n )\n\n def prepare(self, args, parser, local):\n try:\n local_exec = local['ls']\n except local.CommandNotFound:\n parser.error('command not available')\n\n yield local_exec[args.remainder]\n\nIf ``ls`` is not available, the user is presented the following message upon executing the above::\n\n usage: listdir [-h] [--tb] [-q] [-d] [-s] [--no-show] ...\n listdir: error: command not available\n\nAccess to the parsed argument namespace\n~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\n\nThe command invocation's parsed arguments are most straight-forwardly accessible as the first argument of the ``Command`` invocation signature, either ``__call__`` or ``prepare``. However, in less-than-trivial implementations, wherein command methods are factored for reusability, passing the argument namespace from method to method may become tedious. To support such scenarios, this object is made additionally available via the ``Command`` *property*, ``args``.\n\nConsider a class of commands which require a database password. 
We don't want to store this password anywhere in plain text; rather, we expect it to be input, either via (piped) standard input or the TTY::\n\n class DbSync(Command):\n \"\"\"sync databases\"\"\"\n\n def __init__(self, parser):\n parser.add_argument(\n '-p', '--password',\n action='store_true',\n dest='stdin_password',\n default=False,\n help=\"read database password from standard input\",\n )\n\n def __call__(self, args):\n engine = self.dbengine(args)\n ...\n\n def dbcreds(self, args):\n dbcreds = {\n 'username': os.getenv('PGUSER'),\n 'host': os.getenv('PGHOST'),\n 'port': os.getenv('PGPORT'),\n 'database': os.getenv('PGDATABASE'),\n }\n\n missing = [key for (key, value) in dbcreds.items() if not value]\n if missing:\n raise RuntimeError(\n \"database connection information missing from \"\n \"environmental configuration: \" + ', '.join(missing)\n )\n\n if args.stdin_password:\n dbcreds['password'] = sys.stdin.read().rstrip('\\n\\r')\n\n # we're done with the (pipe) stdin, so force it back to TTY for\n # any subsequent input()\n sys.stdin = open('/dev/tty')\n else:\n dbcreds['password'] = os.getenv('PGPASSWORD')\n if not dbcreds['password']:\n dbcreds['password'] = getpass.getpass(\n 'enter password for '\n + ('{username}@{host}:{port}'.format_map(dbcreds) | colors.bold)\n + ': '\n | colors.yellow\n )\n\n return dbcreds\n\n def dburi(self, args):\n return sqlalchemy.engine.url.URL('postgres', **self.dbcreds(args))\n\n def dbengine(self, args):\n return sqlalchemy.create_engine(self.dburi(args))\n\nNot only were we forced to verbosely daisy-chain the arguments namespace, ``args``, from method to method; moreover, we were prevented from (trivially) caching the result of ``dbcreds``, to ensure that the password isn't ever requested more than once.\n\nNow, let's reimplement the above, making use of the property ``args``::\n\n class DbSync(Command):\n \"\"\"sync databases\"\"\"\n\n def __init__(self, parser):\n parser.add_argument(\n '-p', '--password',\n action='store_true',\n dest='stdin_password',\n default=False,\n help=\"read database password from standard input\",\n )\n\n def __call__(self):\n engine = self.dbengine\n ...\n\n @cachedproperty\n def dbcreds(self):\n dbcreds = {\n 'username': os.getenv('PGUSER'),\n 'host': os.getenv('PGHOST'),\n 'port': os.getenv('PGPORT'),\n 'database': os.getenv('PGDATABASE'),\n }\n\n missing = [key for (key, value) in dbcreds.items() if not value]\n if missing:\n raise RuntimeError(\n \"database connection information missing from \"\n \"environmental configuration: \" + ', '.join(missing)\n )\n\n if self.args.stdin_password:\n dbcreds['password'] = sys.stdin.read().rstrip('\\n\\r')\n\n # we're done with the (pipe) stdin, so force it back to TTY for\n # any subsequent input()\n sys.stdin = open('/dev/tty')\n else:\n dbcreds['password'] = os.getenv('PGPASSWORD')\n if not dbcreds['password']:\n dbcreds['password'] = getpass.getpass(\n 'enter password for '\n + ('{username}@{host}:{port}'.format_map(dbcreds) | colors.bold)\n + ': '\n | colors.yellow\n )\n\n return dbcreds\n\n @property\n def dburi(self):\n return sqlalchemy.engine.url.URL('postgres', **self.dbcreds)\n\n @property\n def dbengine(self):\n return sqlalchemy.create_engine(self.dburi)\n\nIn this form, ``args`` needn't be passed from method to method; in fact, methods of the ``DbSync`` command needn't worry about arguments which don't directly interest them at all. 
Command hierarchy
-----------------

Our tools should be modular and composable, favoring atomicity over monolithism. Nevertheless, well-designed, -structured and -annotated code and application interfaces pay their users and developers tremendous dividends over time – no less in the case of more extensive interfaces, and particularly so for project management libraries (consider the ``Makefile``).

``argcmdr`` intends to facilitate the definition of ``argparse``-based interfaces regardless of their structure. But it's in multi-level, or hierarchical, command argumentation that ``argcmdr`` shines.

Nested commands
~~~~~~~~~~~~~~~

Rather than procedurally defining subparsers, ``Command`` class declarations may simply be nested.

Let's define an executable file ``manage`` for managing a codebase::

    #!/usr/bin/env python

    import os

    from argcmdr import Local, main

    class Management(Local):
        """manage deployment"""

        def __init__(self, parser):
            parser.add_argument(
                '-e', '--env',
                choices=('development', 'production'),
                default='development',
                help="target environment",
            )

        class Build(Local):
            """build app"""

            def prepare(self, args):
                req_path = os.path.join('requirements', f'{args.env}.txt')
                yield self.local['pip']['install', '-r', req_path]

        class Deploy(Local):
            """deploy app"""

            def prepare(self, args):
                yield self.local['eb']['deploy', args.env]

    if __name__ == '__main__':
        main(Management)

``Local`` command ``Management``, above, defines no functionality of its own. As such, executing ``manage`` without arguments prints its autogenerated usage::

    usage: manage [-h] [--tb] [-q] [-d] [-s] [--no-show]
                  [-e {development,production}]
                  {build,deploy} ...

Because ``Management`` extends ``Local``, it inherits argumentation controlling whether standard output is printed and offering to run commands in "dry" mode. (Note, however, that it could have omitted these options by extending ``Command``. Moreover, it may override class method ``base_parser()``.)

``Management`` adds to the basic interface the optional argument ``--env``. Most important, however, are the related, nested commands ``Build`` and ``Deploy``, which define functionality via ``prepare``.
Neither nested command extends its subparser – though either could; rather, both depend upon the common argumentation defined by ``Management``.

Exploring the interface via ``--help`` tells us a great deal, for example ``manage -h``::

    usage: manage [-h] [--tb] [-q] [-d] [-s] [--no-show]
                  [-e {development,production}]
                  {build,deploy} ...

    manage deployment

    optional arguments:
      -h, --help            show this help message and exit
      --tb, --traceback     print error tracebacks
      -q, --quiet           do not print command output
      -d, --dry-run         do not execute commands, but print what they are
                            (unless --no-show is provided)
      -s, --show            print command expressions (by default not printed
                            unless dry-run)
      --no-show             do not print command expressions (by default not
                            printed unless dry-run)
      -e {development,production}, --env {development,production}
                            target environment

    management commands:
      {build,deploy}        available commands
        build               build app
        deploy              deploy app

And ``manage deploy -h``::

    usage: manage deploy [-h]

    deploy app

    optional arguments:
      -h, --help  show this help message and exit

As such, a "dry run"::

    manage -de production deploy

prints the following::

    > /home/user/.local/bin/eb deploy production

and, without the dry-run flag, the above operating system command is executed.

Root commands
~~~~~~~~~~~~~

There is no artificial limit to the number of levels you may add to your command hierarchy. However, application interfaces are commonly "wider" than they are "deep". For this reason, as an alternative to class-nesting, the hierarchical relationship may be defined by a class decorator provided by the ``RootCommand``.

Let's define the executable file ``git`` with no particular purpose whatsoever::

    #!/usr/bin/env python

    from argcmdr import Command, RootCommand, main

    class Git(RootCommand):
        """another stupid content tracker"""

        def __init__(self, parser):
            parser.add_argument(
                '-C',
                default='.',
                dest='path',
                help="run as if git was started in <path> instead of the current "
                     "working directory.",
            )

    @Git.register
    class Stash(Command):
        """stash the changes in a dirty working directory away"""

        def __call__(self, args):
            self['save'].delegate()

        class Save(Command):
            """save your local modifications to a new stash"""

            def __init__(self, parser):
                parser.add_argument(
                    '-p', '--patch',
                    dest='interactive',
                    action='store_true',
                    default=False,
                    help="interactively select hunks from the diff between HEAD "
                         "and the working tree to be stashed",
                )

            def __call__(self, args):
                print("stash save", f"(interactive: {args.interactive})")

        class List(Command):
            """list the stashes that you currently have"""

            def __call__(self):
                print("stash list")

    if __name__ == '__main__':
        main(Git)

We anticipate adding many subcommands to ``git`` beyond ``stash``; and so, rather than nest all of these command classes under ``Git``:

* we've defined ``Git`` as a ``RootCommand``
* we've declared ``Stash`` at the module root
* we've decorated ``Stash`` with ``Git.register``

The ``RootCommand`` functions identically to the ``Command``; it only adds this ability to extend the listing of its subcommands by those registered via its decorator. (Notably, ``LocalRoot`` composes the functionality of ``Local`` and ``RootCommand`` via multiple inheritance.)
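For example, a minimal sketch of a ``LocalRoot``-based interface (the ``Manage`` and ``Build`` names and the ``docker`` invocation are merely illustrative) might read::

    from argcmdr import Local, LocalRoot, main

    class Manage(LocalRoot):
        """manage the project"""

    @Manage.register
    class Build(Local):
        """build the app"""

        def prepare(self):
            # a registered subcommand may still generate shell commands
            # via prepare and the local dictionary
            yield self.local['docker']['build', '.']

    if __name__ == '__main__':
        main(Manage)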
The ``stash`` command of our ``git`` example, on the other hand, has opted to contain the entirety of its hierarchical functionality, nesting its own subcommands ``list`` and ``save``.

Nevertheless, you are not limited to a single ``RootCommand``. Any command whose hierarchy you would like to extend via the ``register`` decorator may inherit it. Moreover, the ``@cmd`` decorator accepts the keyword flag ``root``.

Method commands
~~~~~~~~~~~~~~~

Decorator-manufactured commands are no less capable than those derived from class declaration syntax, *except* in that other commands cannot, syntactically, be nested beneath them. (For that reason the ``@cmd`` decorator's ``root`` flag is of note.) Decorator-manufactured commands can nonetheless themselves extend hierarchies, either by being further decorated by ``register`` or by being nested under command class declarations::

    @Git.register
    class Stash(Command):
        """stash the changes in a dirty working directory away"""

        def __call__(self, args):
            self['save'].delegate()

        @cmd('-p', '--patch', dest='interactive', action='store_true', default=False,
             help="interactively select hunks from the diff between HEAD "
                  "and the working tree to be stashed")
        def save(args):
            """save your local modifications to a new stash"""
            print("stash save", f"(interactive: {args.interactive})")

        @cmd
        def list():
            """list the stashes that you currently have"""
            print("stash list")

Above, we've rewritten the trivial ``stash`` commands ``save`` and ``list`` as ``@cmd``-decorated functions.

Say, however, that we needed to invert the factoring of ``save`` logic between that command and its parent::

    @Git.register
    class Stash(Command):
        """stash the changes in a dirty working directory away"""

        def perform_save(self, interactive=False):
            print("stash save", f"(interactive: {interactive})")

        def __call__(self):
            self.perform_save()

        @cmd('-p', '--patch', dest='interactive', action='store_true', default=False,
             help="interactively select hunks from the diff between HEAD "
                  "and the working tree to be stashed")
        @cmd(binding=True)
        def save(self, args):
            """save your local modifications to a new stash"""
            self[-1].perform_save(args.interactive)

        @cmd
        def list():
            """list the stashes that you currently have"""
            print("stash list")

(Note that ``cmd`` can accept both an ``argparse`` argument specification and command feature-defining arguments at once; however, this is of use mainly to the definition of helpers such as the ``local`` decorator, as this style is difficult to read and otherwise discouraged. Moreover, only the **first** – *i.e.* innermost – ``cmd`` decorator's command features are respected.)

In this version, ``save`` functionality is shared as a method of ``Stash``. ``save`` is able to access this method only by ascending the command hierarchy. This might make particular sense when multiple nested commands must share functionality, which is defined on the command class under which they are nested. (Note, however, that in a case such as this one, where the shared method *could* be defined as a ``staticmethod``, it is advisable to do so, and for nested commands to access it directly, *e.g.* as ``Stash.perform_save``.)

Our above reference to ``self`` in ``save``, however, is at first glance misleading.
This command *looks* like an instance method of ``Stash``; yet it's its own ``Command``, and the ``save`` function receives, as its first invocation argument, an instance of the ``Command`` class manufactured from ``save``. Moreover, in this case, ``save`` gains nothing from this self-reference; its class defines no special attributes or functionality of its own beyond argument-parsing.

To improve on the above, we may instead decorate our command function with ``cmdmethod``::

    @Git.register
    class Stash(Command):
        """stash the changes in a dirty working directory away"""

        def perform_save(self, interactive=False):
            print("stash save", f"(interactive: {interactive})")

        def __call__(self):
            self.perform_save()

        @cmdmethod('-p', '--patch', dest='interactive', action='store_true', default=False,
                   help="interactively select hunks from the diff between HEAD "
                        "and the working tree to be stashed")
        def save(self, args):
            """save your local modifications to a new stash"""
            self.perform_save(args.interactive)

The ``cmdmethod`` decorator (as well as the complementary ``localmethod`` decorator) alters the binding of the decorated function such that it receives the instance of its parent command – not itself – upon invocation. Much cleaner.

As with the ``local`` decorator, ``cmdmethod`` is merely a wrapper of ``cmd``. Identical functionality can be achieved via the ``binding`` keyword, though far more verbosely::

    from argcmdr import CommandDecorator

    @cmd(binding=CommandDecorator.Binding.parent)
    def save(self, args):
        ...

Walking the hierarchy
~~~~~~~~~~~~~~~~~~~~~

Unlike the base command ``git`` in the example above, the command ``git stash`` – despite defining its own subcommands – also defines its own functionality, via ``__call__``. This functionality, however, is merely a shortcut to the ``stash`` command ``save``. Rather than repeat the definition of this functionality, ``Stash`` "walks" its hierarchy to access the instantiation of ``Save``, and invokes this command by reference.

Much of ``argcmdr`` is defined at the class level, and as such many ``Command`` methods are class methods. In the static or class context, we might walk the command hierarchy by reference, *e.g.* to ``Stash.Save``; or, from a class method of ``Stash``, as ``cls.Save``. Moreover, ``Command`` defines the class-level "property" ``subcommands``, which returns a list of ``Command`` classes immediately "under" it in the hierarchy.

The hierarchy of executable command objects, however, is instantiated at runtime and cached within the ``Command`` instance. To facilitate navigation of this hierarchy, the ``Command`` object is itself subscriptable. Look-up keys may be:

* strings – descend the hierarchy to the named command
* negative integers – ascend the hierarchy this many levels
* a sequence combining the above – to combine "steps" into a single action

In the above example, ``Stash`` may have (redundantly) accessed ``Save`` with the look-up key::

    (-1, 'stash', 'save')

that is, with the full expression::

    self[-1, 'stash', 'save']

(The single key ``'save'``, however, was far more to the point.)

Because command look-ups are relative to the current command, ``Command`` also offers the ``property`` ``root``, which returns the base command. As such, our redundant expression could be rewritten::

    self.root['stash', 'save']
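At the class level, correspondingly, the ``subcommands`` listing might be enumerated as in the following minimal sketch (the formatting is merely illustrative)::

    # enumerate the Command classes declared immediately under Stash
    for command_class in Stash.subcommands:
        print(command_class.__name__.lower(), '-', command_class.__doc__)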
Finally, a command instance's immediate subcommands may be iterated by iterating over the command itself, *e.g.*::

    def __call__(self):
        for subcommand in self:
            subcommand.delegate()

Command delegation
~~~~~~~~~~~~~~~~~~

As you've seen above, command instance subscription enables access to ancestor and descendant commands of the command hierarchy.

And simple ``Command`` instances may indeed be executed directly via ``__call__``. However, above, we instead invoked the ``delegate`` method. Why?

* ``__call__`` must be invoked as defined – including its argument signature – which may or may not include ``args`` and/or ``parser`` (and which may change during development)
* The ``args`` and ``parser`` in the scope of the delegating command (generally the command actually selected by user argumentation) reflect the arguments defined for that command, *not* those of the delegated command.

For ``Local`` command instances, the situation, without ``delegate``, is worse:

* To generate system commands (rather than executing them immediately), we must know to target ``prepare`` rather than ``__call__``

For example, above, our ``Stash`` command might look like the following without ``delegate``::

    class Stash(Command):
        """stash the changes in a dirty working directory away"""

        def __call__(self, args):
            self['save'](args)

        class Save(Command):
            """save your local modifications to a new stash"""

            def __init__(self, parser):
                parser.add_argument(
                    '-p', '--patch',
                    dest='interactive',
                    action='store_true',
                    default=False,
                    help="interactively select hunks from the diff between HEAD "
                         "and the working tree to be stashed",
                )

            def __call__(self, args):
                interactive = getattr(args, 'interactive', False)
                print("stash save", f"(interactive: {interactive})")

Note, in ``Stash.__call__``, the passing through of ``args``; and, in ``Stash.Save.__call__``, the use of ``getattr``. With ``delegate``, neither is required.

The call() method
+++++++++++++++++

You'll *also* find that there's the command method ``call`` (without underscores)!

This is a shortcut for ``delegate('__call__', …)``: *i.e.*, it will delegate to the bound command by invoking only its ``__call__`` method (even if it's a ``Local`` command defining ``prepare``).

Whereas ``delegate`` is useful for switching between commands via their default invocation methods (either ``__call__`` or ``prepare``), and for switching between execution methods of a single command, ``call`` is useful for ensuring that the bound command will be executed – *i.e.* that its ``__call__`` method will be invoked – regardless of its type. This is important to ``argcmdr`` itself (in ``argcmdr.main``), and useful for command delegation across disparate base classes.
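To contrast the two, a minimal sketch, reusing the ``Stash`` example (the ``call`` invocation is shown commented out, so that ``save`` runs only once)::

    class Stash(Command):
        """stash the changes in a dirty working directory away"""

        def __call__(self, args):
            # delegate() invokes the subcommand's default execution method:
            # __call__ here, or prepare were Save a Local command
            self['save'].delegate()

            # call(), by contrast, would invoke __call__ unconditionally,
            # even on a Local command defining prepare:
            # self['save'].call()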
The management file
-------------------

In addition to the interface of custom executables, ``argcmdr`` endeavors to improve the generation and maintainability of non-executable but standardized files, intended for the management of code development projects and operations.

Similar to a project's ``Makefile``, we might define our previous codebase-management file as the following Python module, ``manage.py``::

    import os

    from argcmdr import Local

    class Management(Local):
        """manage deployment"""

        def __init__(self, parser):
            parser.add_argument(
                '-e', '--env',
                choices=('development', 'production'),
                default='development',
                help="target environment",
            )

        class Build(Local):
            """build app"""

            def prepare(self, args):
                req_path = os.path.join('requirements', f'{args.env}.txt')
                yield self.local['pip']['install', '-r', req_path]

        class Deploy(Local):
            """deploy app"""

            def prepare(self, args):
                yield self.local['eb']['deploy', args.env]

Unlike our original script, ``manage``, ``manage.py`` is not executable, and need define neither an initial shebang line nor a final ``__name__ == '__main__'`` block.

Rather, ``argcmdr`` supplies its own, general-purpose ``manage`` executable command, which loads commands from any ``manage.py`` in the current directory, or from the path specified by the option ``--manage-file PATH``. As such, the usage and functionality of our ``manage.py``, as invoked via argcmdr's installed ``manage`` command, are identical to those of our original ``manage``. We need only ensure that ``argcmdr`` is installed in order to manage any or all project tasks, in a standard way, with even less boilerplate.

Entrypoint definition
~~~~~~~~~~~~~~~~~~~~~

In lieu of an explicitly defined execution path, ``manage`` infers the base command – and hence the entrypoint – of the ``manage.py`` management file module.

The entrypoint of a management file defining – at the module level – only one ``Command``, or multiple commands but only one ``RootCommand``, is assumed to be this one command. Otherwise, the intended entrypoint must be decorated with ``@entrypoint``::

    from argcmdr import entrypoint, RootCommand

    class GoodCommand(RootCommand):

        def good_function(self):
            ...

    @entrypoint
    class CommandEhh(GoodCommand):

        def __call__(self):
            self.good_function()
            ...

    @CommandEhh.register
    class CommandBeh(GoodCommand):

        def __call__(self):
            self.good_function()
            ...

We may infer from the above that ``GoodCommand`` is merely a base class extension, and that the module's CLI begins with the most "root" command, ``CommandEhh``, which is extended by ``CommandBeh``.
However, rather than go out on a limb when presented with these three subclasses of ``Command`` and ``RootCommand``, ``argcmdr`` requires that the intended entrypoint be explicitly marked.

Note, however, that only commands declared at the module, or "top", level are considered potential entrypoints::

    class CommandEhh(Command):

        class CommandBeh(Command):

            ...

Presented with a module containing only the above commands, ``argcmdr`` would identify ``CommandEhh`` as the entrypoint; ``CommandBeh`` would never be considered, even if decorated ``@entrypoint``.

The management package
~~~~~~~~~~~~~~~~~~~~~~

Python *packages*, no less than stand-alone modules, may also be defined for use with the ``manage`` command, to aid in maintenance and development.

Consider the following example directory layout::

    manage/
    ├── __init__.py
    ├── cloud.py
    ├── db.py
    ├── main.py
    ├── morale.py
    ├── server.py
    └── util.py

``argcmdr`` will load the above top-level Python module, ``manage``, just as well as it would the ``manage`` module defined by a ``manage.py`` file (whether or not these are available on the ``PYTHONPATH``).

Furthermore, detecting that *this* ``manage`` is in fact a package, ``argcmdr`` will *automatically* and *recursively* load all of the modules this package contains.

This allows the developer to provide ``argcmdr`` the minimum that it requires at ``manage/__init__.py`` – access to an interface entrypoint, *i.e.* the base ``Command`` – and to organize the development of that interface in whatever maintainable way suits them.

To wit, the developer might simply write, in ``manage/__init__.py``::

    from .main import Main  # noqa

(…And they will have no need of the ``@entrypoint`` decorator, as ``argcmdr`` will only see the one top-level command.)

Of course, that top-level command might have been defined in ``__init__.py``, or, as you might prefer, in ``manage/main.py``::

    from argcmdr import RootCommand

    class Main(RootCommand):
        """your one-stop shop for devops"""

        ...

And each subcommand may be defined in a submodule, such as ``manage/cloud.py``::

    from argcmdr import Command

    from .main import Main

    @Main.register
    class Cloud(Command):
        """manage cloud computing resources"""

        ...

Thanks to automatic loading, the ``Cloud`` subcommand (which will resolve to ``manage cloud``) will be picked up, without additional boilerplate and without any need to consider circular imports.

To disable automatic submodule loading, set the following in ``manage/__init__.py``::

    __auto_init_package__ = False

And to make (custom) use of this feature, see ``argcmdr.init_package()``.

Bootstrapping
~~~~~~~~~~~~~

To ensure that such a friendly – and *relatively* high-level – project requirement as ``argcmdr`` is satisfied, consider the expressly low-level utility install-cli_, with which to guide contributors through the process of provisioning your project's most basic requirements.

Shell completion
----------------

``argcmdr`` supports shell command argument completion via ``argcomplete`` (see argcomplete_).

As explained by its documentation, your user (perhaps in executing the installation of your command) may enable argument completion, either:

* specifically for your shell command
* or generally for any script containing the string **PYTHON_ARGCOMPLETE_OK** in its first 1024 bytes (as in the sketch below)
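In the latter case, the marker need only appear as a comment near the top of the executable. A minimal sketch (the ``Main`` command and its ``--name`` option are merely illustrative)::

    #!/usr/bin/env python
    # PYTHON_ARGCOMPLETE_OK

    from argcmdr import Command, main

    class Main(Command):
        """greet the caller"""

        def __init__(self, parser):
            parser.add_argument('--name', default='world')

        def __call__(self, args):
            print('hello', args.name)

    if __name__ == '__main__':
        main(Main)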
For flexibility (and, *e.g.*, in support of installation into virtual environments, or otherwise where system- or user-global installation is undesirable or impossible), ``argcmdr`` *does not* currently insist on a particular scheme to enable argument completion.

Rather, for example, to enable argument completion system-wide, specifically for the ``manage`` command (provisioned by ``argcmdr``), you might execute the following from a Bash shell (as the root user)::

    register-python-argcomplete --shell bash manage > /usr/share/bash-completion/completions/python-argcomplete-manage.sh

(or, depending upon your system)::

    register-python-argcomplete --shell bash manage > /etc/bash_completion.d/python-argcomplete-manage.sh

Alternatively, the same argument completion may be enabled, but only for the current user::

    mkdir -p ~/.local/share/bash-completion/completions/
    register-python-argcomplete --shell bash manage > ~/.local/share/bash-completion/completions/python-argcomplete-manage.sh

(or, as preferred)::

    mkdir -p ~/.bash_completion.d
    register-python-argcomplete --shell bash manage > ~/.bash_completion.d/python-argcomplete-manage.sh

In the latter case only, the user may also require the file ``~/.bash_completion``, including contents of the following form (such that Bash will load the completion file automatically)::

    if [ -d ~/.bash_completion.d/ ] && [ ! -z "$(ls ~/.bash_completion.d/)" ]; then
        for bcfile in ~/.bash_completion.d/*; do
            . "$bcfile"
        done
    fi

In the case that neither system-wide nor user-only installation is appropriate, the same argument completion may be enabled for the current shell only::

    eval "$(register-python-argcomplete --shell bash manage)"

Regardless of the method, having so enabled argument completion for your command in your shell, ``argcmdr`` will handle the rest, generating completion suggestions based on your command definition.

.. _argparse: https://docs.python.org/3/library/argparse.html
.. _python.org: https://www.python.org/downloads/
.. _Homebrew: https://brew.sh/
.. _pyenv: https://github.com/pyenv/pyenv
.. _pyenv installer: https://github.com/pyenv/pyenv-installer#installation--update--uninstallation
.. _plumbum: https://plumbum.readthedocs.io/en/latest/local_commands.html#exit-codes
.. _Dickens: https://github.com/dssg/dickens
.. _install-cli: https://github.com/dssg/install-cli
.. _argcomplete: https://argcomplete.readthedocs.io/
"bugtrack_url": null,
"license": "MIT",
"summary": "Thin argparse wrapper for quick, clear and easy declaration of hierarchical console command interfaces",
"version": "1.1.0",
"project_urls": {
"Homepage": "https://github.com/dssg/argcmdr"
},
"split_keywords": [],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "5a41458b917cfa8894752fbdb27b86f8b9b78ab20c2f335f9bd27fc7446074c7",
"md5": "328c62b54d04e7bb338472ff29f3744a",
"sha256": "6b0445dc5444eeffc37c19a794d5dab90bb36ac37bb57d56dddd4834205feb13"
},
"downloads": -1,
"filename": "argcmdr-1.1.0-py3-none-any.whl",
"has_sig": false,
"md5_digest": "328c62b54d04e7bb338472ff29f3744a",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.6",
"size": 23397,
"upload_time": "2024-02-28T23:10:03",
"upload_time_iso_8601": "2024-02-28T23:10:03.041319Z",
"url": "https://files.pythonhosted.org/packages/5a/41/458b917cfa8894752fbdb27b86f8b9b78ab20c2f335f9bd27fc7446074c7/argcmdr-1.1.0-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "a00d8519feb791101b75498a31f6010a678c736cf6e39a09e5bd99558899c167",
"md5": "298ceb4dfd98b5603de02da3a80ce145",
"sha256": "a77966378806f03fd2b9e1d3ed3995e24a1e54b761dad8e80a662e25cf11a03d"
},
"downloads": -1,
"filename": "argcmdr-1.1.0.tar.gz",
"has_sig": false,
"md5_digest": "298ceb4dfd98b5603de02da3a80ce145",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.6",
"size": 52155,
"upload_time": "2024-02-28T23:10:06",
"upload_time_iso_8601": "2024-02-28T23:10:06.033458Z",
"url": "https://files.pythonhosted.org/packages/a0/0d/8519feb791101b75498a31f6010a678c736cf6e39a09e5bd99558899c167/argcmdr-1.1.0.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2024-02-28 23:10:06",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "dssg",
"github_project": "argcmdr",
"travis_ci": false,
"coveralls": false,
"github_actions": false,
"tox": true,
"lcname": "argcmdr"
}