haproxy-log-analysis
====================

:Name: haproxy-log-analysis
:Version: 5.1.0
:Home page: https://github.com/gforcada/haproxy_log_analysis
:Summary: HAProxy log analyzer that tries to give an insight of what's going on
:Upload time: 2022-12-03 14:02:21
:Author: Gil Forcada
:Requires Python: >=3.7
:License: GPL v3
:Keywords: haproxy, log, analysis, report
:Requirements: attrs, coverage, exceptiongroup, iniconfig, packaging, pluggy, pyparsing, pytest, pytest-cov, tomli

.. -*- coding: utf-8 -*-

HAProxy log analyzer
====================
This Python package is a `HAProxy`_ log parser.
It analyzes HAProxy log files in multiple ways (see commands section below).

.. note::
   Currently only the `HTTP log format`_ is supported.
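
As a rough illustration of what the HTTP log format contains, the sketch below pulls a few fields out of a simplified HTTP-format line with a regular expression. It is a minimal example, not the package's actual parser, and the regex covers only a fragment of the real format:

```python
import re

# Simplified HAProxy HTTP log line (syslog prefix stripped), illustrative only.
LINE = (
    '10.0.1.2:33317 [06/Feb/2009:12:14:14.655] http-in '
    'static/srv1 10/0/30/69/109 200 2750 - - ---- '
    '1/1/1/1/0 0/0 "GET /index.html HTTP/1.1"'
)

# Capture only a handful of fields; the real format has many more.
PATTERN = re.compile(
    r'(?P<client_ip>[\d.]+):(?P<client_port>\d+) '
    r'\[(?P<accept_date>[^\]]+)\] '
    r'(?P<frontend>\S+) (?P<backend>\S+)/(?P<server>\S+) '
    r'(?P<timings>[\d/+-]+) (?P<status>-?\d+) (?P<bytes>\d+) '
    r'.* "(?P<request>[^"]+)"'
)

match = PATTERN.match(LINE)
print(match.group('client_ip'), match.group('status'), match.group('request'))
# prints: 10.0.1.2 200 GET /index.html HTTP/1.1
```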

Tests and coverage
------------------
No project is trustworthy if it does not have tests and decent coverage!

.. image:: https://github.com/gforcada/haproxy_log_analysis/actions/workflows/tests.yml/badge.svg?branch=master
   :target: https://github.com/gforcada/haproxy_log_analysis/actions/workflows/tests.yml

.. image:: https://coveralls.io/repos/github/gforcada/haproxy_log_analysis/badge.svg?branch=master
   :target: https://coveralls.io/github/gforcada/haproxy_log_analysis?branch=master


Documentation
-------------
See the `documentation and API`_ at ReadTheDocs_.

Command-line interface
----------------------
The current ``--help`` looks like this::

  usage: haproxy_log_analysis [-h] [-l LOG] [-s START] [-d DELTA] [-c COMMAND]
                              [-f FILTER] [-n] [--list-commands]
                              [--list-filters] [--json]

  Analyze HAProxy log files and outputs statistics about it

  optional arguments:
    -h, --help            show this help message and exit
    -l LOG, --log LOG     HAProxy log file to analyze
    -s START, --start START
                          Process log entries starting at this time, in HAProxy
                          date format (e.g. 11/Dec/2013 or
                          11/Dec/2013:19:31:41). At least provide the
                          day/month/year. Values not specified will use their
                          base value (e.g. 00 for hour). Use in conjunction with
                          -d to limit the number of entries to process.
    -d DELTA, --delta DELTA
                          Limit the number of entries to process. Express the
                          time delta as a number and a time unit, e.g.: 1s, 10m,
                          3h or 4d (for 1 second, 10 minutes, 3 hours or 4
                          days). Use in conjunction with -s to only analyze
                          certain time delta. If no start time is given, the
                          time on the first line will be used instead.
    -c COMMAND, --command COMMAND
                          List of commands, comma separated, to run on the log
                          file. See --list-commands to get a full list of them.
    -f FILTER, --filter FILTER
                          List of filters to apply on the log file. Passed as
                          comma separated and parameters within square brackets,
                          e.g ip[192.168.1.1],ssl,path[/some/path]. See --list-
                          filters to get a full list of them.
    -n, --negate-filter   Make filters passed with -f work the other way around,
                          i.e. if the ``ssl`` filter is passed instead of
                          showing only ssl requests it will show non-ssl
                          traffic. If the ``ip`` filter is used, then all but
                          that ip passed to the filter will be used.
    --list-commands       Lists all commands available.
    --list-filters        Lists all filters available.
    --json                Output results in json.
    --invalid             Print the lines that could not be parsed. Be aware
                          that mixing it with the print command will mix their
                          output.
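
The ``-d`` delta syntax described above (a number plus one of ``s``/``m``/``h``/``d``) can be sketched as a tiny parser. This is an illustrative assumption about how such a value could be interpreted, not the package's own code:

```python
from datetime import timedelta

# Map the documented unit suffixes to timedelta keyword arguments.
UNITS = {'s': 'seconds', 'm': 'minutes', 'h': 'hours', 'd': 'days'}

def parse_delta(spec):
    """Turn a delta such as '10m' or '3h' into a timedelta (sketch only)."""
    value, unit = int(spec[:-1]), spec[-1]
    if unit not in UNITS:
        raise ValueError(f'unknown time unit: {unit!r}')
    return timedelta(**{UNITS[unit]: value})

print(parse_delta('10m'))  # prints: 0:10:00
```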


Commands
--------

Commands are small, purpose-specific programs that report specific statistics about the log file being analyzed.
See them all with ``--list-commands`` or online at https://haproxy-log-analyzer.readthedocs.io/modules.html#module-haproxy.commands.

- ``average_response_time``
- ``average_waiting_time``
- ``connection_type``
- ``counter``
- ``http_methods``
- ``ip_counter``
- ``print``
- ``queue_peaks``
- ``request_path_counter``
- ``requests_per_hour``
- ``requests_per_minute``
- ``server_load``
- ``slow_requests``
- ``slow_requests_counter``
- ``status_codes_counter``
- ``top_ips``
- ``top_request_paths``

Filters
-------
Filters, unlike commands,
are a way to reduce the number of log lines to be processed.

.. note::
   The ``-n`` command line argument reverses the effect of the filters.

   This helps when looking for specific traces, like a certain IP or a path.

See them all with ``--list-filters`` or online at https://haproxy-log-analyzer.readthedocs.io/modules.html#module-haproxy.filters.

- ``backend``
- ``frontend``
- ``http_method``
- ``ip``
- ``ip_range``
- ``path``
- ``response_size``
- ``server``
- ``slow_requests``
- ``ssl``
- ``status_code``
- ``status_code_family``
- ``wait_on_queues``
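
The ``-f`` spec syntax shown in ``--help`` (e.g. ``ip[192.168.1.1],ssl,path[/some/path]``) can be sketched as a small parser that splits the spec into (name, argument) pairs. This is illustrative only, not the package's implementation:

```python
def parse_filters(spec):
    """Split a comma-separated filter spec into (name, argument) pairs."""
    result = []
    for part in spec.split(','):
        if part.endswith(']') and '[' in part:
            name, _, arg = part.partition('[')
            result.append((name, arg[:-1]))  # drop the closing bracket
        else:
            result.append((part, None))  # filter without an argument, e.g. ssl
    return result

print(parse_filters('ip[192.168.1.1],ssl,path[/some/path]'))
# prints: [('ip', '192.168.1.1'), ('ssl', None), ('path', '/some/path')]
```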

Installation
------------
Install it with pip::

    $ pip install haproxy_log_analysis

After installation you will have a console script ``haproxy_log_analysis``.

TODO
----
- add more commands: *(help appreciated)*

  - reports on servers connection time
  - reports on termination state
  - reports around connections (active, frontend, backend, server)
  - *your ideas here*

- think of a way to show the commands output in a meaningful way

- be able to specify an output format. For any command that makes sense (slow
  requests for example) output the given fields for each log line (i.e.
  acceptance date, path, downstream server, load at that time...)

- *your ideas*

.. _HAProxy: http://haproxy.1wt.eu/
.. _HTTP log format: http://cbonte.github.io/haproxy-dconv/2.2/configuration.html#8.2.3
.. _documentation and API: https://haproxy-log-analyzer.readthedocs.io/
.. _ReadTheDocs: http://readthedocs.org

CHANGES
=======

5.1.0 (2022-12-03)
------------------

- Only get the first IP from `X-Forwarded-For` header.
  [gforcada]

- Improve tests robustness.
  [gforcada]

- Fix `top_ips` and `top_request_paths` commands output.
  They were showing all output, rather than only the top 10.
  [gforcada]

- Move `tests` folder to the top-level.
  [gforcada]

5.0.0 (2022-11-27)
------------------

- Drop testing on travis-ci.
  [gforcada]

- Use GitHub Actions.
  [gforcada]

- Format the code with `pyupgrade`, `black` and `isort`.
  [gforcada]

- Use `pip-tools` to keep dependencies locked.
  [gforcada]

- Bump python versions supported to 3.7-3.11 and pypy.
  [gforcada]

- Drop python 3.6 (EOL).
  [gforcada]

4.1.0 (2020-01-06)
------------------

- **New command:** ``requests_per_hour``.
  Just like the ``requests_per_minute`` but with hour granularity.
  Idea and first implementation done by ``valleedelisle``.
  [gforcada]

- Fix parsing truncated requests.
  Idea and first implementation by ``vixns``.
  [gforcada]

4.0.0 (2020-01-06)
------------------

**BREAKING CHANGES:**

- Complete rewrite to use almost no memory usage even on huge files.
  [gforcada]

- Add parallelization to make parsing faster by parsing multiple lines in parallel.
  [gforcada]

- Rename command ``counter_slow_requests`` to ``slow_requests_counter``,
  so it is aligned with all other ``_counter`` commands.
  [gforcada]

- Changed the ``counter_invalid`` command to a new command line switch ``--invalid``.
  [gforcada]

**Regular changes:**

- Drop Python 2 support, and test on Python 3.8.
  [gforcada]

- Remove the pickling support.
  [gforcada]

- Add `--json` output command line option.
  [valleedelisle]

3.0.0 (2019-06-10)
------------------

- Fix spelling.
  [EdwardBetts]

- Make ``ip_counter`` use ``client_ip`` by default.
  [vixns]

- Overhaul testing environment. Test on python 3.7 as well. Use black to format.
  [gforcada]

2.1 (2017-07-06)
----------------
- Enforce QA checks (flake8) on code.
  All code has been updated to follow it.
  [gforcada]

- Support Python 3.6.
  [gforcada]

- Support different syslog timestamps (at least NixOS).
  [gforcada]

2.0.2 (2016-11-17)
------------------

- Improve performance for ``cmd_print``.
  [kevinjqiu]

2.0.1 (2016-10-29)
------------------

- Allow hostnames to have a dot in them.
  [gforcada]

2.0 (2016-07-06)
----------------
- Handle unparsable HTTP requests.
  [gforcada]

- Only test on python 2.7 and 3.5
  [gforcada]

2.0b0 (2016-04-18)
------------------
- Check the divisor before doing a division to not get ``ZeroDivisionError`` exceptions.
  [gforcada]

2.0a0 (2016-03-29)
------------------
- Major refactoring:

  - Rename modules and classes:

    - ``haproxy_logline`` -> ``line``
    - ``haproxy_logfile`` -> ``logfile``
    - ``HaproxyLogLine`` -> ``Line``
    - ``HaproxyLogFile`` -> ``Log``

  - Parse the log file on ``Log()`` creation (i.e. in its ``__init__``).

  [gforcada]

1.3 (2016-03-29)
----------------

- New filter: ``filter_wait_on_queues``.
  Get all requests that waited at maximum X amount of milliseconds on HAProxy queues.
  [gforcada]

- Code/docs cleanups and add code analysis.
  [gforcada]

- Avoid using eval.
  [gforcada]

1.2.1 (2016-02-23)
------------------

- Support -1 as a status_code
  [Christopher Baines]

1.2 (2015-12-07)
----------------

- Allow a hostname on the syslog part (not only IPs)
  [danny crasto]

1.1 (2015-04-19)
----------------

- Make syslog optional.
  Fixes issue https://github.com/gforcada/haproxy_log_analysis/issues/10.
  [gforcada]

1.0 (2015-03-24)
----------------

- Fix issue #9.
  log line on the syslog part was too strict,
  it was expecting the hostname to be a string and was
  failing if it was an IP.
  [gforcada]

0.0.3.post2 (2015-01-05)
------------------------

- Finally really fixed issue #7.
  ``namespace_packages`` was not meant to be on setup.py at all.
  Silly copy&paste mistake.
  [gforcada]

0.0.3.post (2015-01-04)
-----------------------

- Fix release on PyPI.
  Solves GitHub issue #7.
  https://github.com/gforcada/haproxy_log_analysis/issues/7
  [gforcada]

0.0.3 (2014-07-09)
------------------

- Fix release on PyPI (again).
  [gforcada]

0.0.2 (2014-07-09)
------------------

- Fix release on PyPI.
  [gforcada]

0.0.1 (2014-07-09)
------------------

- Pickle :class:`.HaproxyLogFile` data for faster performance.
  [gforcada]

- Add a way to negate the filters, so that instead of being able to filter by
  IP, it can output all but that IP information.
  [gforcada]

- Add lots of filters: ip, path, ssl, backend, frontend, server, status_code
  and so on. See ``--list-filters`` for a complete list of them.
  [gforcada]

- Add a :meth:`.HaproxyLogFile.parse_data` method to get data from a data stream.
  It allows you to use it as a library.
  [bogdangi]

- Add ``--list-filters`` argument on the command line interface.
  [gforcada]

- Add ``--filter`` argument on the command line interface, inspired by
  Bogdan's early design.
  [bogdangi] [gforcada]

- Create a new module :mod:`haproxy.filters` that holds all available filters.
  [gforcada]

- Improve :meth:`.HaproxyLogFile.cmd_queue_peaks` output to not only show
  peaks but also when requests started to queue and when they finished and
  the amount of requests that had been queued.
  [gforcada]

- Show help when no argument is given.
  [gforcada]

- Polish documentation and docstrings here and there.
  [gforcada]

- Add a ``--list-commands`` argument on the command line interface.
  [gforcada]

- Generate an API doc for ``HaproxyLogLine`` and ``HaproxyLogFile``.
  [bogdangi]

- Create a ``console_script`` `haproxy_log_analysis` for ease of use.
  [bogdangi]

- Add Sphinx documentation system, still empty.
  [gforcada]

- Keep valid log lines sorted so that the exact order of connections is kept.
  [gforcada]

- Add quite a few commands, see `README.rst`_ for a complete list of them.
  [gforcada]

- Run commands passed as arguments (with -c flag).
  [gforcada]

- Add a requirements.txt file to keep track of dependencies and pin them.
  [gforcada]

- Add travis_ and coveralls_ support. See its badges on `README.rst`_.
  [gforcada]

- Add argument parsing and custom validation logic for all arguments.
  [gforcada]

- Add regular expressions for haproxy log lines (HTTP format) and to
  parse HTTP requests path.
  Added tests to ensure they work as expected.
  [gforcada]

- Create distribution.
  [gforcada]

.. _travis: https://travis-ci.org/
.. _coveralls: https://coveralls.io/
.. _README.rst: http://github.com/gforcada/haproxy_log_analysis


            
