nb-log-file-handler

Name: nb-log-file-handler
Version: 0.2
Home page: https://github.com/ydf0509/nb_log_file_handler
Summary: Multiprocess-safe log file handler with both time- and size-based rotation; benchmarked 100x faster than concurrent_log_handler
Upload time: 2023-07-09 14:34:57
Maintainer: ydf
Author: bfzs
License: BSD License
Keywords: logging, logger, multiprocess file handler
# nb_log_file_handler

Multiprocess-safe log file handler with both time- and size-based rotation; benchmarked 100x faster than concurrent_log_handler.


nb_log_file_handler is a FileHandler that rotates safely across multiple processes, by both time and size, and far outperforms concurrent_log_handler.ConcurrentRotatingFileHandler.


## Installation

pip install nb_log_file_handler

## How nb_log_file_handler works

On Windows, nb_log_file_handler buffers log records and writes them to the file in batches every 0.1 seconds, and an atexit hook writes out whatever is still pending when the program is about to exit.
On Linux, file I/O is fast enough on its own, and forked child processes do not run atexit handlers, so each record is written to the file immediately.
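
As an illustration of this batching scheme (not the package's actual implementation; the class and attribute names below are made up), a minimal handler that buffers records, flushes them on a timer, and flushes the remainder at exit could look like this:

```python
# Minimal sketch of the batched-write idea described above. This is NOT
# NbLogFileHandler's real code; all names here are illustrative.
import atexit
import logging
import threading


class BatchedFileHandler(logging.Handler):
    """Buffers formatted records and appends them to the file in batches."""

    def __init__(self, filename, interval=0.1):
        super().__init__()
        self._filename = filename
        self._interval = interval
        self._buffer = []
        self._buffer_lock = threading.Lock()
        atexit.register(self._flush)      # write whatever is still pending at exit
        self._schedule_flush()

    def emit(self, record):
        with self._buffer_lock:
            self._buffer.append(self.format(record) + '\n')

    def _schedule_flush(self):
        timer = threading.Timer(self._interval, self._on_timer)
        timer.daemon = True
        timer.start()

    def _on_timer(self):
        self._flush()
        self._schedule_flush()            # keep flushing roughly every `interval` seconds

    def _flush(self):
        with self._buffer_lock:
            lines, self._buffer = self._buffer, []
        if lines:
            # open/append/close per batch so writes from several processes interleave
            with open(self._filename, 'a', encoding='utf-8') as f:
                f.writelines(lines)
```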

nb_log_file_handler far outperforms concurrent_log_handler.ConcurrentRotatingFileHandler.

## 1. How to use nb_log_file_handler

Usage is the same as a regular FileHandler: import NbLogFileHandler and add it to a logger, as in the code below.

```python
import logging
from nb_log_file_handler import NbLogFileHandler

logger = logging.getLogger('hello')

# file_name and log_path locate the log file; max_bytes and back_count control size-based rotation
fh = NbLogFileHandler(file_name='xx3.log', log_path='./', max_bytes=1000 * 1000, back_count=3)

logger.addHandler(fh)
# logger.addHandler(logging.StreamHandler())
logger.setLevel(logging.DEBUG)


def f():
    for i in range(10000):
        logger.info(f'{i}aaaaa' * 20)


if __name__ == '__main__':
    f()
```
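
Since NbLogFileHandler is attached through logger.addHandler like any logging.Handler, the standard formatter API should apply as usual; a quick sketch (the format string is just an example, not something prescribed by the package):

```python
import logging
from nb_log_file_handler import NbLogFileHandler

fh = NbLogFileHandler(file_name='xx3.log', log_path='./', max_bytes=1000 * 1000, back_count=3)
# attach a standard logging.Formatter, as with any other handler
fh.setFormatter(logging.Formatter('%(asctime)s - %(levelname)s - %(message)s'))

logger = logging.getLogger('hello')
logger.addHandler(fh)
logger.setLevel(logging.DEBUG)
```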


## 2. Comparison of file handlers that rotate by size/time

To make the multiprocess size-rotation problem easy to reproduce, every handler below uses maxBytes of 1000*1000 bytes, i.e. rotation at 1 MB.


### 2.1 The built-in logging.handlers.RotatingFileHandler

logging.handlers.RotatingFileHandler is completely unusable for size-based rotation across multiple processes; it errors constantly whenever it rotates.

```python

import multiprocessing
import logging.handlers
import time

logger = logging.getLogger('hello')

fh = logging.handlers.RotatingFileHandler('xx4.log',maxBytes=1000*1000,backupCount=3)

logger.addHandler(fh)
# logger.addHandler(logging.StreamHandler())
logger.setLevel(logging.DEBUG)



def f():
    for i in range(10000):
        logger.info(f'{i}aaaaa'*20)

if __name__ == '__main__':
    t1 = time.time()
    ps = []
    for  j in range(10):
        p = multiprocessing.Process(target=f)
        ps.append(p)
        p.start()
    for p in ps:
        p.join()
    print(time.time()-t1)
```

With the built-in logging.handlers.RotatingFileHandler as the file handler, this code errors constantly: when process A reaches the size limit and renames the log file to rotate it, process B knows nothing about it and fails with a PermissionError ([WinError 32], "the file is being used by another process"):
```
PermissionError: [WinError 32] 另一个程序正在使用此文件,进程无法访问。: 'D:\\codes\\nb_log_file_handler\\tests_nb_log_file_handler\\xx4.log' -> 'D:\\codes\\nb_log_file_handler\\tests_nb_log_file_handler\\xx4.log.1'
```

So when multiple processes write to the same log file and rotation is needed, the built-in RotatingFileHandler cannot be used; a third-party file handler is required.
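
The underlying Windows behaviour can be reproduced without logging at all: renaming a file that an open handle still holds fails with the same PermissionError. A minimal standalone demonstration (file names are arbitrary; on Linux the rename would simply succeed):

```python
# Standalone reproduction of the rotation conflict described above:
# on Windows, a file that is still open cannot be renamed, which is exactly
# what happens when process A rotates the log while process B holds it open.
import os

with open('demo.log', 'a') as still_open:
    still_open.write('some log line\n')
    still_open.flush()
    try:
        os.rename('demo.log', 'demo.log.1')   # this is what rotation does
    except PermissionError as e:
        print('rotation failed:', e)          # [WinError 32] on Windows
```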

### 2.2 The fairly well-known multiprocess-safe third-party package concurrent_log_handler

The handler under test is concurrent_log_handler.ConcurrentRotatingFileHandler:

```python


import multiprocessing
import logging
import time
from concurrent_log_handler import ConcurrentRotatingFileHandler

logger = logging.getLogger('hello')

fh = ConcurrentRotatingFileHandler('xx2.log',maxBytes=1000*1000,backupCount=3)

logger.addHandler(fh)
# logger.addHandler(logging.StreamHandler())
logger.setLevel(logging.DEBUG)



def f():
    for i in range(10000):
        logger.info(f'{i}aaaaa'*20)

if __name__ == '__main__':
    t1 = time.time()
    ps = []
    for  j in range(10):
        p = multiprocessing.Process(target=f)
        ps.append(p)
        p.start()
    for p in ps:
        p.join()
    print(time.time()-t1)
```

On Windows the concurrent_log_handler package is unbearably slow: 10 processes each writing 10,000 records takes 263 seconds. On Linux its performance is acceptable.

### 2.3 nb_log_file_handler.NbLogFileHandler: multiprocess-safe rotation by time and size, far faster than concurrent_log_handler

```python


import multiprocessing
import logging
import time
from nb_log_file_handler import NbLogFileHandler

logger = logging.getLogger('hello')

fh = NbLogFileHandler(file_name='xx3.log',log_path='./',max_bytes=1000*1000,back_count=3)

logger.addHandler(fh)
# logger.addHandler(logging.StreamHandler())
logger.setLevel(logging.DEBUG)



def f():
    for i in range(10000):
        logger.info(f'{i}aaaaa'*20)

if __name__ == '__main__':
    t1 = time.time()
    ps = []
    for  j in range(10):
        p = multiprocessing.Process(target=f)
        ps.append(p)
        p.start()
    for p in ps:
        p.join()
    print(time.time()-t1)
```

With nb_log_file_handler.NbLogFileHandler, 10 processes each writing 10,000 records take only 1.3 seconds; nb_log_file_handler far outperforms the third-party concurrent_log_handler package.
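
Summarizing the three runs above (10 processes, 10,000 records each, 1 MB rotation, on Windows):

| Handler | Result |
| --- | --- |
| logging.handlers.RotatingFileHandler | crashes with PermissionError during rotation |
| concurrent_log_handler.ConcurrentRotatingFileHandler | 263 s |
| nb_log_file_handler.NbLogFileHandler | 1.3 s |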

            
