diegojromerolopez/backgroundlog

backgroundlog

Thread-based log handler for better performance


Don't let logging slow down your Python program.

Introduction

Most of the time we log to disk, and all I/O is time-consuming. By offloading log writes to a background thread, you can speed up your Python application considerably.
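The same idea exists in the standard library as QueueHandler/QueueListener: the application thread only enqueues records, and a background thread performs the slow I/O. A minimal stdlib sketch of the pattern (for illustration only, this is not backgroundlog's implementation):

```python
import io
import logging
import queue
from logging.handlers import QueueHandler, QueueListener

# Bounded queue shared between the application thread and the logging thread.
log_queue = queue.Queue(maxsize=1000)

# The application logs through a QueueHandler, which only enqueues records.
logger = logging.getLogger("queued_example")
logger.setLevel(logging.INFO)
logger.propagate = False
logger.addHandler(QueueHandler(log_queue))

# A background listener thread drains the queue and does the actual writing.
stream = io.StringIO()  # stand-in for a real file or terminal
listener = QueueListener(log_queue, logging.StreamHandler(stream))
listener.start()

logger.info("logged without blocking on I/O")
listener.stop()  # flush remaining records and join the background thread
```

backgroundlog's ThreadHandler packages this pattern into a single handler, as shown below.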

Use

Default use

import logging
from backgroundlog.handlers.thread_handler import ThreadHandler

# Setting up the logging thread handler
file_handler = logging.FileHandler('/var/log/myapp.log', mode="a", encoding="utf-8")
thread_handler = ThreadHandler(file_handler)

# Creating a new logger
bg_logger = logging.getLogger('bg_logger')
bg_logger.setLevel(logging.INFO)

# Adding the thread handler
bg_logger.addHandler(thread_handler)

# Using the logger
bg_logger.info('This is a log message')

Options

Set a queue size

from backgroundlog.handlers.thread_handler import ThreadHandler

thread_handler = ThreadHandler(file_handler, queue_size=5000)

By default, the queue size is 1000.

Set a blocking policy by logging record levels

When records are put on the queue, the queue may be full. We provide a way to deal with this: a blocking policy by log record level. Records at a blocking level wait until there is room in the queue; records at a non-blocking level are discarded, and a dropped-record counter is incremented.

Only info, error and critical records are blocking:
from backgroundlog.handlers.thread_handler import ThreadHandler
from logging import CRITICAL, ERROR, INFO

thread_handler = ThreadHandler(file_handler, blocking_levels={INFO, ERROR, CRITICAL})

Only error and critical records are blocking:
from backgroundlog.handlers.thread_handler import ThreadHandler
from logging import CRITICAL, ERROR

thread_handler = ThreadHandler(file_handler, blocking_levels={ERROR, CRITICAL})

Only critical records are blocking:
from backgroundlog.handlers.thread_handler import ThreadHandler
from logging import CRITICAL

thread_handler = ThreadHandler(file_handler, blocking_levels={CRITICAL})

No records are blocking:
from backgroundlog.handlers.thread_handler import ThreadHandler

thread_handler = ThreadHandler(file_handler, blocking_levels=None)

By default, error and critical records are blocking; the rest are not.
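The drop-on-full behavior can be sketched with the standard library alone. The following is a hypothetical illustration of the policy (the `enqueue` function and `BLOCKING_LEVELS` name are made up for this sketch, not backgroundlog's actual code):

```python
import logging
import queue

# The documented default policy: only ERROR and CRITICAL block.
BLOCKING_LEVELS = {logging.ERROR, logging.CRITICAL}

log_queue = queue.Queue(maxsize=2)  # tiny queue to demonstrate dropping
dropped = 0


def enqueue(record: logging.LogRecord) -> None:
    global dropped
    if record.levelno in BLOCKING_LEVELS:
        log_queue.put(record)  # block until a slot is free
    else:
        try:
            log_queue.put_nowait(record)  # never block the caller...
        except queue.Full:
            dropped += 1  # ...drop the record and count it instead


# Three INFO records against a queue of size 2: the third one is dropped.
for _ in range(3):
    enqueue(logging.makeLogRecord({"levelno": logging.INFO}))
```

With this policy, bursts of low-severity records can never stall the application, while high-severity records are guaranteed to reach the queue.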

Performance testing

We have run several local tests with different logging handlers. See the file run_performance_comparison.py for a full catalog of the performance tests we ran.

We used two versions of Python, depending on whether they have the global interpreter lock (GIL):

  • Python 3.15.3 (with GIL)
  • Python 3.15.3t (without GIL)

All tests run 100_000 iterations of emitting the same log message, and were executed on a MacBook Pro M1 with 16 GB of RAM.
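A simplified timing harness in the spirit of these tests could look like the following (a hypothetical sketch, not the repository's run_performance_comparison.py; the `time_handler` name is made up):

```python
import io
import logging
import time


def time_handler(handler: logging.Handler, iterations: int = 1000) -> float:
    """Return the mean time in ms to emit one record through `handler`."""
    logger = logging.getLogger(f"bench-{id(handler)}")
    logger.setLevel(logging.INFO)
    logger.propagate = False
    logger.addHandler(handler)

    start = time.perf_counter()
    for i in range(iterations):
        logger.info("benchmark message %d", i)
    elapsed = time.perf_counter() - start

    logger.removeHandler(handler)
    return elapsed / iterations * 1000.0


# Baseline: a plain StreamHandler writing to an in-memory buffer.
mean_ms = time_handler(logging.StreamHandler(io.StringIO()))
```

A real comparison would run the same loop against each handler (StreamHandler, FileHandler, and ThreadHandler wrapping each) and average over several runs to compute the mean and standard deviation reported below.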

Python 3.15.3 (with GIL)

Logging Handler                Mean Time (ms)  Std Dev (ms)  Spent Time vs. Baseline
StreamHandler                  0.685           0.006         baseline
FileHandler                    0.685           0.018         +0.03%
ThreadHandler (StreamHandler)  0.487           0.030         -28.911%
ThreadHandler (FileHandler)    0.475           0.002         -30.66%

There is a ~30% improvement when using the thread handler.

Python 3.15.3t (without GIL)

Logging Handler                Mean Time (ms)  Std Dev (ms)  Spent Time vs. Baseline
StreamHandler                  0.539           0.004         baseline
FileHandler                    0.545           0.013         +1.109%
ThreadHandler (StreamHandler)  0.344           0.002         -36.301%
ThreadHandler (FileHandler)    0.339           0.001         -37.118%

There is a ~36% improvement when using the thread handler, about 6 percentage points more than with the GIL-enabled Python build.

Conclusions

Not blocking the main flow of your program by using backgroundlog gives you a speed gain of roughly 30%.

Dependencies

This package has no dependencies.

Python version support

Minimum version support is 3.10.

License

MIT license; if you need a different license, contact me.