># Origin project location
> Please note that this repo is a fork of: https://github.com/cmanaha/python-elasticsearch-logger
>
> The base code is extended by:
> - adding `HttpNtlmAuth` as a new authentication method
> - adding support for Elasticsearch 8.x
>
> These changes are awaiting review in:
> - https://github.com/cmanaha/python-elasticsearch-logger/pull/94
> - https://github.com/cmanaha/python-elasticsearch-logger/pull/93
___
[![downloads](https://img.shields.io/pypi/dd/CMRESHandler.svg)](https://pypi.python.org/pypi/CMRESHandler)
[![versions](https://img.shields.io/pypi/pyversions/CMRESHandler.svg)](https://pypi.python.org/pypi/CMRESHandler)
[![status](https://img.shields.io/pypi/status/CMRESHandler.svg)](https://pypi.python.org/pypi/CMRESHandler)
[![license](https://img.shields.io/pypi/l/CMRESHandler.svg)](https://pypi.python.org/pypi/CMRESHandler)
[![ci_status](https://travis-ci.org/cmanaha/python-elasticsearch-logger.svg?branch=master)](https://travis-ci.org/cmanaha/python-elasticsearch-logger)
[![codecov](https://codecov.io/github/cmanaha/python-elasticsearch-logger/coverage.svg?branch=master)](http://codecov.io/github/cmanaha/python-elasticsearch-logger?branch=master)
[![gitter](https://badges.gitter.im/Join%20Chat.svg)](https://gitter.im/cmanaha/python-elasticsearch-logger?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge)
# Python Elasticsearch Log handler
This library provides an Elasticsearch logging appender compatible with the
Python standard [logging](https://docs.python.org/3/library/logging.html) library.
The source code is on GitHub at
[https://github.com/cmanaha/python-elasticsearch-logger](https://github.com/cmanaha/python-elasticsearch-logger).
## Installation
Install using pip:
```shell
pip install CMRESHandler-NtlmAuth
```
## Requirements (Python 3)
This library requires the following dependencies:
- elasticsearch
- requests
### Additional requirements for Kerberos support
The package optionally supports Kerberos authentication by adding the following dependency:
- requests-kerberos
### Additional requirements for NTLM support
The package optionally supports NTLM authentication by adding the following dependency:
- requests-ntlm
### Additional requirements for AWS IAM user authentication (request signing)
The package optionally supports AWS IAM user authentication by adding the following dependency:
- requests-aws4auth
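Each optional dependency above corresponds to an `auth_type` value on the handler. As a hedged sketch of the NTLM case (this fork's headline feature), the snippet below only assembles the configuration; the host and the `DOMAIN\user` credentials are placeholders, and the commented-out lines show how it would be passed to the handler once `CMRESHandler-NtlmAuth` and `requests-ntlm` are installed:

```python
# Sketch: configuration for NTLM authentication.
# The host and the 'DOMAIN\\user' / password values are placeholders.
ntlm_config = {
    'hosts': [{'host': 'es.example.com', 'port': 9200, 'scheme': 'https'}],
    'auth_details': ('DOMAIN\\user', 'secret'),  # (user, password) tuple
    'es_index_name': 'my_python_index',
}

# With CMRESHandler-NtlmAuth and requests-ntlm installed:
# from cmreslogging.handlers import CMRESHandler
# handler = CMRESHandler(auth_type=CMRESHandler.AuthType.NTLM_AUTH, **ntlm_config)
```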
# Using the handler in your program
To initialise the handler, just add it to your logger as follows:
```python
import logging

from cmreslogging.handlers import CMRESHandler

handler = CMRESHandler(hosts=[{'host': 'localhost', 'port': 9200, 'scheme': 'http'}],
                       auth_type=CMRESHandler.AuthType.NO_AUTH,
                       es_index_name="my_python_index")
log = logging.getLogger("PythonTest")
log.setLevel(logging.INFO)
log.addHandler(handler)
```
You can add fields upon initialization, providing more data about the execution context:
```python
import logging

from cmreslogging.handlers import CMRESHandler

handler = CMRESHandler(hosts=[{'host': 'localhost', 'port': 9200, 'scheme': 'http'}],
                       auth_type=CMRESHandler.AuthType.NO_AUTH,
                       es_index_name="my_python_index",
                       es_additional_fields={'App': 'MyAppName', 'Environment': 'Dev'})
log = logging.getLogger("PythonTest")
log.setLevel(logging.INFO)
log.addHandler(handler)
```
These additional fields will be attached to every log record and stored in Elasticsearch.
To log, use the regular commands from the logging library:
```python
log.info("This is an info statement that will be logged into elasticsearch")
```
Your code can also attach extra fields on a per-log basis, which can be used to instrument
operations. For example, when reading information from a database you could do something like:
```python
start_time = time.time()
database_operation()
db_delta = time.time() - start_time
log.debug("DB operation took %.3f seconds", db_delta, extra={'db_execution_time': db_delta})
```
The code above executes the DB operation, measures the time it took, and logs an entry whose
message contains the elapsed time as a string. For convenience, it also creates a field
called `db_execution_time` holding the elapsed time as a float, which can be used in Kibana,
on top of Elasticsearch, to plot how long these operations take.
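The pattern above can be wrapped in a small helper so that every instrumented operation gets a timing field automatically. This is a sketch using only the standard library; the `timed` helper and its `field` parameter are illustrative, not an API of this package — only the `extra={...}` mechanism comes from the example above.

```python
import logging
import time
from contextlib import contextmanager

log = logging.getLogger("PythonTest")

@contextmanager
def timed(operation, field='db_execution_time'):
    # Measure the elapsed wall-clock time of the wrapped block and log it
    # both in the message and as an extra field for plotting in Kibana.
    start = time.time()
    try:
        yield
    finally:
        delta = time.time() - start
        log.debug("%s took %.3f seconds", operation, delta, extra={field: delta})

with timed("DB operation"):
    time.sleep(0.01)  # stand-in for database_operation()
```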
# Initialization parameters
The constructor takes the following parameters:
- hosts: The list of hosts that the Elasticsearch client will connect to; multiple hosts are allowed.
  Use `'scheme'` to determine whether to use SSL (`use_ssl` is deprecated): set `'scheme': 'https'` to use SSL, or `'scheme': 'http'` if you don't need it.
  For example:
  ```python
  [{'host': 'host1', 'port': 9200, 'scheme': 'https'}, {'host': 'host2', 'port': 9200, 'scheme': 'http'}]
  ```
- auth_type: The authentication type; `CMRESHandler.AuthType` currently supports NO_AUTH, BASIC_AUTH, KERBEROS_AUTH, NTLM_AUTH and AWS_SIGNED_AUTH
- auth_details: When ``CMRESHandler.AuthType.BASIC_AUTH`` or ``CMRESHandler.AuthType.NTLM_AUTH`` is used, this argument must contain a tuple of strings with the user and password that will be used to authenticate against the Elasticsearch servers, for example `('User', 'Password')`
- aws_access_key: When ``CMRESHandler.AuthType.AWS_SIGNED_AUTH`` is used, this argument must contain the AWS access key ID of the AWS IAM user
- aws_secret_key: When ``CMRESHandler.AuthType.AWS_SIGNED_AUTH`` is used, this argument must contain the AWS secret key of the AWS IAM user
- aws_region: When ``CMRESHandler.AuthType.AWS_SIGNED_AUTH`` is used, this argument must contain the AWS region of the AWS Elasticsearch servers, for example ``'us-east-1'``
- verify_ssl: A boolean that defines whether the SSL certificates are validated
- buffer_size: An int; once this size is reached in the internal buffer, records are flushed to Elasticsearch
- flush_frequency_in_sec: A float representing how often the buffer will be flushed
- es_index_name: A string with the prefix of the Elasticsearch index that will be created; a date suffix in the format YYYY.MM.dd is appended. ``python_logger`` is used by default
- index_name_frequency: The frequency to use as part of the index naming. Currently supports
  CMRESHandler.IndexNameFrequency.DAILY, CMRESHandler.IndexNameFrequency.WEEKLY,
  CMRESHandler.IndexNameFrequency.MONTHLY and CMRESHandler.IndexNameFrequency.YEARLY; daily rotation is used by default
- es_additional_fields: A dictionary with all the additional fields that you would like to add to the logs
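As an illustration of how the index name prefix and date suffix combine under daily rotation, here is a sketch of the naming scheme described above. The `daily_index_name` helper is purely illustrative; the library's exact formatting may differ.

```python
from datetime import date

def daily_index_name(prefix: str, day: date) -> str:
    # Daily rotation sketch: the handler writes to an index named
    # "<prefix>-YYYY.MM.dd" (illustrative, not the library's exact code).
    return "{}-{:%Y.%m.%d}".format(prefix, day)

print(daily_index_name("python_logger", date(2024, 4, 17)))
# python_logger-2024.04.17
```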
# Django Integration
It is also very easy to integrate the handler with [Django](https://www.djangoproject.com/). What is even
better, at DEBUG level Django logs information such as how long DB connections take to return, so
it can be plotted in Kibana, along with the SQL statements that Django executed.
```python
from cmreslogging.handlers import CMRESHandler

LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'file': {
            'level': 'DEBUG',
            'class': 'logging.handlers.RotatingFileHandler',
            'filename': './debug.log',
            'maxBytes': 102400,
            'backupCount': 5,
        },
        'elasticsearch': {
            'level': 'DEBUG',
            'class': 'cmreslogging.handlers.CMRESHandler',
            'hosts': [{'host': 'localhost', 'port': 9200, 'scheme': 'http'}],
            'es_index_name': 'my_python_app',
            'es_additional_fields': {'App': 'Test', 'Environment': 'Dev'},
            'auth_type': CMRESHandler.AuthType.NO_AUTH,
            'use_ssl': False,
        },
    },
    'loggers': {
        'django': {
            'handlers': ['file', 'elasticsearch'],
            'level': 'DEBUG',
            'propagate': True,
        },
    },
}
```
There is more information about how Django logging works in the
[Django documentation](https://docs.djangoproject.com/en/1.9/topics/logging/).
# Building the sources & Testing
To build the package, follow the standard `python setup.py` workflow.
To run the tests, just execute the Python tests in the test folder.
# Why use an appender rather than Logstash or Beats?
In some cases it is quite useful to forward all the information available in the LogRecord, as it contains
things such as exception information and the method, file, and line where the log was generated.
If you are interested in understanding more about the differences between the agent and handler
approaches, I'd suggest reading [this conversation thread](https://github.com/cmanaha/python-elasticsearch-logger/issues/44).
The same functionality can be implemented in many other ways. For example, consider the integration
using [SysLogHandler](https://docs.python.org/3/library/logging.handlers.html#sysloghandler) and
the [logstash syslog plugin](https://www.elastic.co/guide/en/logstash/current/plugins-inputs-syslog.html).
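As a hedged sketch of that alternative, using only the standard library: the syslog address below is a placeholder for wherever a Logstash syslog input would be listening, and the Logstash side is not shown.

```python
import logging
import logging.handlers

# Sketch of the SysLogHandler alternative mentioned above. The address is a
# placeholder; point it at the host/port where the logstash syslog input listens.
handler = logging.handlers.SysLogHandler(address=('localhost', 514))

log = logging.getLogger("PythonTest")
log.setLevel(logging.INFO)
log.addHandler(handler)
log.info("This record goes over syslog instead of directly to Elasticsearch")
```

Note the trade-off: syslog transports only the formatted message, so the structured `extra` fields the Elasticsearch handler sends would need to be re-parsed on the Logstash side.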
# Contributing back
Feel free to use this as is or even better, feel free to fork and send your pull requests over.