PYESHandler
===========

:Name: PYESHandler
:Version: 1.0.2
:Home page: https://github.com/mkhadher/python-elasticsearch-logger
:Summary: Elasticsearch Log handler for the logging library
:Upload time: 2024-02-15 09:43:51
:Author: Mohammed Mousa
:License: Apache2
:Keywords: logging, elasticsearch, handler, log, django, flask, instrumentation
:Requirements: No requirements were recorded.
Python Elasticsearch Log handler
********************************
This library provides an Elasticsearch logging appender compatible with the Python standard `logging <https://docs.python.org/2/library/logging.html>`_ library.

This library is a fork of `https://github.com/cmanaha/python-elasticsearch-logger <https://github.com/cmanaha/python-elasticsearch-logger>`_. Thanks to Carlos Manzanedo Rueda for starting this library.

The source code is on GitHub at `https://github.com/mkhadher/python-elasticsearch-logger <https://github.com/mkhadher/python-elasticsearch-logger>`_.

Installation
============
Install using pip:

.. code-block::

    pip install PYESHandler

Requirements Python 2
=====================
This library requires the following dependencies:
 - elasticsearch
 - requests
 - enum

Requirements Python 3
=====================
This library requires the following dependencies:
 - elasticsearch
 - requests

Additional requirements for Kerberos support
============================================
Additionally, the package optionally supports Kerberos authentication by adding the following dependency (a usage sketch follows the list):
 - requests-kerberos
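
A minimal sketch of enabling Kerberos authentication, assuming ``requests-kerberos`` is installed and a valid Kerberos ticket is already available in the environment; the host and index name below are placeholders, and whether additional details are needed depends on your setup ::

    import logging
    from pyeslogging.handlers import PYESHandler

    # KERBEROS_AUTH delegates credential handling to requests-kerberos,
    # so no username/password is passed explicitly here.
    handler = PYESHandler(hosts=[{'host': 'es.example.com', 'port': 9200}],
                          auth_type=PYESHandler.AuthType.KERBEROS_AUTH,
                          use_ssl=True,
                          es_index_name="my_python_index")
    log = logging.getLogger("KerberosTest")
    log.addHandler(handler)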

Additional requirements for AWS IAM user authentication (request signing)
=========================================================================
Additionally, the package optionally supports AWS IAM user authentication (request signing) by adding the following dependency (a usage sketch follows the list):
 - requests-aws4auth
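
A hedged sketch of signed requests against an AWS-hosted Elasticsearch domain. Only ``aws_secret_key`` and ``aws_region`` are documented in the parameter list below; the ``aws_access_key`` parameter name and the endpoint are assumptions for illustration, so check the handler signature in your installed version ::

    import logging
    from pyeslogging.handlers import PYESHandler

    # AWS_SIGNED_AUTH signs each request via requests-aws4auth.
    handler = PYESHandler(hosts=[{'host': 'search-mydomain.us-east-1.es.amazonaws.com', 'port': 443}],
                          auth_type=PYESHandler.AuthType.AWS_SIGNED_AUTH,
                          aws_access_key='MY_ACCESS_KEY',  # assumed parameter name
                          aws_secret_key='MY_SECRET_KEY',
                          aws_region='us-east-1',
                          use_ssl=True,
                          es_index_name="my_python_index")
    log = logging.getLogger("AWSTest")
    log.addHandler(handler)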

Using the handler in your program
=================================
To initialise and create the handler, just add the handler to your logger as follows ::

    import logging
    from pyeslogging.handlers import PYESHandler

    handler = PYESHandler(hosts=[{'host': 'localhost', 'port': 9200}],
                          auth_type=PYESHandler.AuthType.NO_AUTH,
                          es_index_name="my_python_index")
    log = logging.getLogger("PythonTest")
    log.setLevel(logging.INFO)
    log.addHandler(handler)

You can add fields upon initialisation, providing more data about the execution context ::

    import logging
    from pyeslogging.handlers import PYESHandler

    handler = PYESHandler(hosts=[{'host': 'localhost', 'port': 9200}],
                          auth_type=PYESHandler.AuthType.NO_AUTH,
                          es_index_name="my_python_index",
                          es_additional_fields={'App': 'MyAppName', 'Environment': 'Dev'})
    log = logging.getLogger("PythonTest")
    log.setLevel(logging.INFO)
    log.addHandler(handler)

These additional fields will be attached to every log record and stored in Elasticsearch.

To log, use the regular calls from the logging library ::

    log.info("This is an info statement that will be logged into elasticsearch")

Your code can also attach additional extra fields on a per-log basis, which can be used to instrument
operations. For example, when reading information from a database you could do something like ::

    import time

    start_time = time.time()
    database_operation()  # placeholder for your own DB call
    db_delta = time.time() - start_time
    log.debug("DB operation took %.3f seconds" % db_delta, extra={'db_execution_time': db_delta})

The code above executes the DB operation, measures how long it took, and logs an entry whose message
contains the elapsed time as a string. For convenience, it also creates a field called
``db_execution_time`` holding the value as a float, which can be used in Kibana (on top of
Elasticsearch) to plot how long these operations are taking.

Initialisation parameters
=========================
The constructor takes the following parameters (a combined example follows this list):
 - hosts: The list of hosts that the Elasticsearch client will connect to; multiple hosts are allowed, for example ::

    [{'host':'host1','port':9200}, {'host':'host2','port':9200}]


 - auth_type: The authentication type; currently supported values of ``PYESHandler.AuthType`` are ``NO_AUTH``, ``BASIC_AUTH``, ``KERBEROS_AUTH`` and ``AWS_SIGNED_AUTH``
 - auth_details: When ``PYESHandler.AuthType.BASIC_AUTH`` or the string "BASIC_AUTH" is used, this argument must contain the user and password used to authenticate against the Elasticsearch servers, either as a tuple of strings such as ``('User', 'Password')`` or as a dictionary such as ``{"username": "my_username", "password": "my_fancy_password"}``
 - aws_secret_key: When ``PYESHandler.AuthType.AWS_SIGNED_AUTH`` or the string "AWS_SIGNED_AUTH" is used, this argument must contain the AWS secret key of the AWS IAM user
 - aws_region: When ``PYESHandler.AuthType.AWS_SIGNED_AUTH`` or the string "AWS_SIGNED_AUTH" is used, this argument must contain the AWS region of the AWS Elasticsearch servers, for example ``'us-east'``
 - use_ssl: A boolean that defines whether the communication should use SSL encryption
 - verify_ssl: A boolean that defines whether the SSL certificates are validated
 - buffer_size: An int; once the internal buffer reaches this size, the records are flushed to Elasticsearch
 - flush_frequency_in_sec: A float defining how often, in seconds, the buffer is flushed
 - es_index_name: A string with the prefix of the Elasticsearch index that will be created. A date formatted as
   YYYY.MM.dd is appended to this prefix; ``python_logger`` is used by default
 - index_name_frequency: The frequency to use as part of the index naming. Currently supports
   ``PYESHandler.IndexNameFrequency.DAILY``, ``PYESHandler.IndexNameFrequency.WEEKLY``,
   ``PYESHandler.IndexNameFrequency.MONTHLY``, ``PYESHandler.IndexNameFrequency.YEARLY`` and
   ``PYESHandler.IndexNameFrequency.NEVER``. By default the daily rotation is used
 - es_doc_type: A string with the name of the document type that will be used; ``python_log`` is used by default
 - es_additional_fields: A dictionary with all the additional fields that you would like to add to the logs
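
Putting several of these parameters together, a minimal sketch, assuming the ``IndexNameFrequency`` enum is exposed on ``PYESHandler`` in the same way as ``AuthType``; host, index name, field values and the buffer/flush numbers are illustrative placeholders ::

    import logging
    from pyeslogging.handlers import PYESHandler

    handler = PYESHandler(hosts=[{'host': 'localhost', 'port': 9200}],
                          auth_type=PYESHandler.AuthType.NO_AUTH,
                          use_ssl=False,
                          buffer_size=1000,            # flush once 1000 records are buffered...
                          flush_frequency_in_sec=5.0,  # ...or every 5 seconds
                          index_name_frequency=PYESHandler.IndexNameFrequency.WEEKLY,
                          es_index_name="my_python_index",
                          es_additional_fields={'App': 'MyAppName', 'Environment': 'Dev'})
    log = logging.getLogger("PythonTest")
    log.setLevel(logging.INFO)
    log.addHandler(handler)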

Django Integration
==================
It is also very easy to integrate the handler with `Django <https://www.djangoproject.com/>`_. Even
better, at DEBUG level Django logs information such as how long DB connections take to return, so
these can be plotted in Kibana, as can the SQL statements that Django executed. ::

    from pyeslogging.handlers import PYESHandler
    LOGGING = {
        'version': 1,
        'disable_existing_loggers': False,
        'handlers': {
            'file': {
                'level': 'DEBUG',
                'class': 'logging.handlers.RotatingFileHandler',
                'filename': './debug.log',
                'maxBytes': 102400,
                'backupCount': 5,
            },
            'elasticsearch': {
                'level': 'DEBUG',
                'class': 'pyeslogging.handlers.PYESHandler',
                'hosts': [{'host': 'localhost', 'port': 9200}],
                'es_index_name': 'my_python_app',
                'es_additional_fields': {'App': 'Test', 'Environment': 'Dev'},
                'auth_type': PYESHandler.AuthType.NO_AUTH,
                'use_ssl': False,
            },
        },
        'loggers': {
            'django': {
                'handlers': ['file','elasticsearch'],
                'level': 'DEBUG',
                'propagate': True,
            },
        },
    }

There is more information about how Django logging works in the
`Django documentation <https://docs.djangoproject.com/en/1.9/topics/logging//>`_
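
Once the ``LOGGING`` dict above is in ``settings.py``, application code logs through the standard machinery. A small illustrative view (names are placeholders, not part of the library) ::

    import logging
    from django.http import HttpResponse

    logger = logging.getLogger('django')

    def index(request):
        # This record goes to both the rotating file and Elasticsearch,
        # as configured for the 'django' logger above.
        logger.info("Index view hit", extra={'path': request.path})
        return HttpResponse("ok")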

Read logging JSON Config File
=============================
The example below can be used by a Flask app or any Python script. It shows how to configure a logger for file and Elasticsearch logging using a logging JSON config file.

``logging.json`` ::

    {
       "version":1,
       "disable_existing_loggers":true,
       "formatters":{
          "standard":{
             "format":"%(asctime)s - %(levelname)s - %(threadName)s - %(name)s - %(message)s"
          }
       },
       "handlers":{
          "file":{
             "level":"DEBUG",
             "class":"logging.handlers.TimedRotatingFileHandler",
             "formatter":"standard",
             "filename":"./log_file.txt",
             "when":"midnight",
             "backupCount":25
          },
          "elasticsearch":{
             "level":"DEBUG",
             "class":"pyeslogging.handlers.PYESHandler",
             "hosts": [{"host": "my.elastic.domain.com", "port": 9243}],
             "es_index_name":"PYESLogger",
             "auth_type": "BASIC_AUTH",
             "auth_details": {"username": "my_username","password": "my_fancy_password"},
             "use_ssl":true,
             "index_name_frequency": "monthly",
             "verify_ssl": true
          }
       },
       "root":{
          "handlers":[
             "file",
             "elasticsearch"
          ],
          "level":"DEBUG",
          "propagate":false
       }
    }

``app.py`` ::

    import logging
    import logging.config
    from pyeslogging.handlers import PYESHandler
    import json

    # Define the logging.json path (a raw string avoids backslash escapes on Windows)
    with open(r"C:\App\logging.json") as read_file:
        loggingConfigDict = json.load(read_file)
    logging.config.dictConfig(loggingConfigDict)
    logger = logging.getLogger()  # the root logger, as configured by the "root" section
    logger.info("Hello World !")

Building the sources & Testing
------------------------------
To create the package, follow the standard ``setup.py`` workflow.
To test, just execute the Python tests within the test folder.
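
A hedged sketch of the usual commands; the test directory name is an assumption ::

    # Build source and wheel distributions with the project's setup.py
    python setup.py sdist bdist_wheel

    # Discover and run the tests (directory name assumed to be 'tests')
    python -m unittest discover tests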

Why use an appender rather than logstash or beats
-------------------------------------------------
In some cases it is quite useful to ship all the information available within the LogRecord, as it
contains things such as exception information and the method, file and line where the log was generated.


The same functionality can be implemented in many other different ways. For example, consider the integration
using `SysLogHandler <https://docs.python.org/3/library/logging.handlers.html#sysloghandler>`_ and
`logstash syslog plugin <https://www.elastic.co/guide/en/logstash/current/plugins-inputs-syslog.html>`_.
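
As an illustration of that alternative, a minimal sketch wiring the standard library's ``SysLogHandler`` to a Logstash syslog input; the host and port are placeholders ::

    import logging
    from logging.handlers import SysLogHandler

    # Send records over UDP to a Logstash instance running the syslog input plugin.
    handler = SysLogHandler(address=('logstash.example.com', 514))
    log = logging.getLogger("SyslogTest")
    log.addHandler(handler)
    log.warning("Routed through syslog instead of a direct ES appender")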


Contributing back
-----------------
Feel free to use this as is or even better, feel free to fork and send your pull requests over.


            
