Logparser

June 10, 2025


Logparser provides a machine learning toolkit and benchmarks for automated log parsing, which is a crucial step for structured log analytics. By applying logparser, users can automatically extract event templates from unstructured logs and convert raw log messages into a sequence of structured events. The process of log parsing is also known as message template extraction, log key extraction, or log message clustering in the literature.
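To illustrate the idea, the template-extraction step can be sketched as masking variable tokens with `<*>` using a few hand-written regexes. This is a deliberately naive sketch, not any of the parsers in this toolkit (which infer templates automatically rather than relying on hand-written patterns):

```python
import re

def naive_template(log_message):
    """Illustrative only: mask common variable fields (block IDs, IPs,
    bare numbers) with <*> to recover an event template."""
    patterns = [
        r"blk_-?\d+",                        # HDFS block IDs
        r"/?(\d{1,3}\.){3}\d{1,3}(:\d+)?",   # IPv4 addresses, optional leading /
        r"\b\d+\b",                          # bare numbers
    ]
    template = log_message
    for pattern in patterns:
        template = re.sub(pattern, "<*>", template)
    return template

print(naive_template("PacketResponder 1 for block blk_38865049064139660 terminating"))
# -> PacketResponder <*> for block <*> terminating
```

Real log parsers solve the harder problem of discovering such templates without any hand-written patterns.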


Figure: An example of log parsing

🌈 New updates

  • Since the first release of logparser, many PRs and issues have been submitted due to incompatibility with Python 3. We have finally released logparser v1.0.0 with Python 3 support. Thanks for all the contributions (#PR86, #PR85, #PR83, #PR80, #PR65, #PR57, #PR53, #PR52, #PR51, #PR49, #PR18, #PR22)!
  • We have built the package wheel logparser3 and released it on PyPI. Please install it via pip install logparser3.
  • We have refactored the code structure and formatted the code with the Python code formatter black.

Log parsers available:

| Publication | Parser | Paper Title | Benchmark |
|---|---|---|---|
| IPOM'03 | SLCT | A Data Clustering Algorithm for Mining Patterns from Event Logs, by Risto Vaarandi. | :arrow_upper_right: |
| QSIC'08 | AEL | Abstracting Execution Logs to Execution Events for Enterprise Applications, by Zhen Ming Jiang, Ahmed E. Hassan, Parminder Flora, Gilbert Hamann. | :arrow_upper_right: |
| KDD'09 | IPLoM | Clustering Event Logs Using Iterative Partitioning, by Adetokunbo Makanju, A. Nur Zincir-Heywood, Evangelos E. Milios. | :arrow_upper_right: |
| ICDM'09 | LKE | Execution Anomaly Detection in Distributed Systems through Unstructured Log Analysis, by Qiang Fu, Jian-Guang Lou, Yi Wang, Jiang Li. [Microsoft] | :arrow_upper_right: |
| MSR'10 | LFA | Abstracting Log Lines to Log Event Types for Mining Software System Logs, by Meiyappan Nagappan, Mladen A. Vouk. | :arrow_upper_right: |
| CIKM'11 | LogSig | LogSig: Generating System Events from Raw Textual Logs, by Liang Tang, Tao Li, Chang-Shing Perng. | :arrow_upper_right: |
| SCC'13 | SHISO | Incremental Mining of System Log Format, by Masayoshi Mizutani. | :arrow_upper_right: |
| CNSM'15 | LogCluster | LogCluster - A Data Clustering and Pattern Mining Algorithm for Event Logs, by Risto Vaarandi, Mauno Pihelgas. | :arrow_upper_right: |
| CNSM'15 | LenMa | Length Matters: Clustering System Log Messages using Length of Words, by Keiichi Shima. | :arrow_upper_right: |
| CIKM'16 | LogMine | LogMine: Fast Pattern Recognition for Log Analytics, by Hossein Hamooni, Biplob Debnath, Jianwu Xu, Hui Zhang, Geoff Jiang, Adbullah Mueen. [NEC] | :arrow_upper_right: |
| ICDM'16 | Spell | Spell: Streaming Parsing of System Event Logs, by Min Du, Feifei Li. | :arrow_upper_right: |
| ICWS'17 | Drain | Drain: An Online Log Parsing Approach with Fixed Depth Tree, by Pinjia He, Jieming Zhu, Zibin Zheng, and Michael R. Lyu. | :arrow_upper_right: |
| ICPC'18 | MoLFI | A Search-based Approach for Accurate Identification of Log Message Formats, by Salma Messaoudi, Annibale Panichella, Domenico Bianculli, Lionel Briand, Raimondas Sasnauskas. | :arrow_upper_right: |
| TSE'20 | Logram | Logram: Efficient Log Parsing Using n-Gram Dictionaries, by Hetong Dai, Heng Li, Che-Shao Chen, Weiyi Shang, and Tse-Hsun (Peter) Chen. | :arrow_upper_right: |
| ECML-PKDD'20 | NuLog | Self-Supervised Log Parsing, by Sasho Nedelkoski, Jasmin Bogatinovski, Alexander Acker, Jorge Cardoso, Odej Kao. | :arrow_upper_right: |
| ICSME'22 | ULP | An Effective Approach for Parsing Large Log Files, by Issam Sedki, Abdelwahab Hamou-Lhadj, Otmane Ait-Mohamed, Mohammed A. Shehab. | :arrow_upper_right: |
| TSC'23 | Brain | Brain: Log Parsing with Bidirectional Parallel Tree, by Siyu Yu, Pinjia He, Ningjiang Chen, Yifan Wu. | :arrow_upper_right: |
| ICSE'24 | DivLog | DivLog: Log Parsing with Prompt Enhanced In-Context Learning, by Junjielong Xu, Ruichun Yang, Yintong Huo, Chengyu Zhang, and Pinjia He. | :arrow_upper_right: |

:bulb: You are welcome to submit a PR that adds your parser code to logparser and your paper to the table.

Installation

We recommend installing the logparser package and requirements via pip install.

pip install logparser3

In particular, the package depends on the following requirements. Because the matching behavior of the regex library varies across releases, we recommend pinning it to version 2022.3.2.

Note: If you encounter "Error: need to escape...", please follow the instructions here.

  • python 3.6+
  • regex 2022.3.2
  • numpy
  • pandas
  • scipy
  • scikit-learn

Conditional requirements:

  • If using MoLFI: deap
  • If using SHISO: nltk
  • If using SLCT: gcc
  • If using LogCluster: perl
  • If using NuLog: torch, torchvision, keras_preprocessing
  • If using DivLog: openai, tiktoken (require python 3.8+)
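Since several parsers need extra packages, it can help to check for them before running a parser. A minimal sketch, assuming a hypothetical mapping derived from the list above (the `OPTIONAL_DEPS` dict and `missing_deps` helper are not part of logparser):

```python
import importlib.util

# Hypothetical mapping of parsers to their optional dependencies,
# based on the conditional requirements listed above.
OPTIONAL_DEPS = {
    "MoLFI": ["deap"],
    "SHISO": ["nltk"],
    "NuLog": ["torch", "torchvision", "keras_preprocessing"],
    "DivLog": ["openai", "tiktoken"],
}

def missing_deps(parser_name):
    """Return the optional modules for a parser that are not installed."""
    return [mod for mod in OPTIONAL_DEPS.get(parser_name, [])
            if importlib.util.find_spec(mod) is None]

# Parsers with no optional dependencies (e.g., Drain) report nothing missing.
print(missing_deps("Drain"))  # -> []
```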

Get started

  1. Run the demo:

    For each log parser, we provide a demo to help you get started. Each demo shows the basic usage of a target log parser and the hyper-parameters to configure. For example, the following command shows how to run the demo for Drain.

    cd logparser/Drain
    python demo.py
    
  2. Run the benchmark:

    For each log parser, we provide a benchmark script to run log parsing on the loghub_2k datasets for evaluating parsing accuracy. You can also use other benchmark datasets for log parsing.

    cd logparser/Drain 
    python benchmark.py
    

    The benchmarking results can be found at the readme file of each parser, e.g., https://github.com/logpai/logparser/tree/main/logparser/Drain#benchmark.
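The benchmarks report grouping accuracy: the fraction of log messages whose predicted event group exactly coincides with the ground-truth group. A simplified sketch of this metric (the benchmark scripts' implementation may differ in details), using pandas, which is already a dependency:

```python
import pandas as pd

def grouping_accuracy(truth, predicted):
    """Fraction of messages parsed into groups that exactly match the
    ground-truth grouping. A predicted group counts as correct only if
    it contains all and only the messages of one true event."""
    df = pd.DataFrame({"truth": truth, "pred": predicted})
    correct = 0
    for _, group in df.groupby("pred"):
        truth_ids = group["truth"].unique()
        if len(truth_ids) == 1:
            # The predicted group is pure; it is correct only if the true
            # event's messages were not also split into other groups.
            if (df["truth"] == truth_ids[0]).sum() == len(group):
                correct += len(group)
    return correct / len(df)

# Event E2 is split across groups B and C, so only E1's two messages count.
print(grouping_accuracy(["E1", "E1", "E2", "E2"], ["A", "A", "B", "C"]))
# -> 0.5
```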

  3. Parse your own logs:

    It is easy to apply logparser to parse your own log data. To do so, you need to install the logparser3 package first. Then you can develop your own script following the code snippet below to start log parsing. See the full example code at example/parse_your_own_logs.py.

    from logparser.Drain import LogParser
    
    input_dir = 'PATH_TO_LOGS/' # The input directory of log file
    output_dir = 'result/'  # The output directory of parsing results
    log_file = 'unknow.log'  # The input log file name
    log_format = '<Date> <Time> <Level>:<Content>' # Define log format to split message fields
    # Regular expression list for optional preprocessing (default: [])
    regex = [
        r'(/|)([0-9]+\.){3}[0-9]+(:[0-9]+|)(:|)' # IP
    ]
    st = 0.5  # Similarity threshold
    depth = 4  # Depth of all leaf nodes
    
    parser = LogParser(log_format, indir=input_dir, outdir=output_dir,  depth=depth, st=st, rex=regex)
    parser.parse(log_file)
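The log_format string above is what splits each raw line into header fields plus the free-text Content. As a simplified sketch (logparser's internal implementation differs in details), such a format string can be compiled into a regex with one named group per field:

```python
import re

def format_to_regex(log_format):
    """Sketch: turn '<Date> <Time> <Level>:<Content>' into a compiled
    regex with named groups for each <Field> placeholder."""
    regex = ""
    for part in re.split(r"(<[^<>]+>)", log_format):
        if part.startswith("<") and part.endswith(">"):
            regex += f"(?P<{part[1:-1]}>.*?)"   # field placeholder
        elif part:
            regex += re.escape(part)            # literal separator
    return re.compile("^" + regex + "$")

line = "081109 203518 INFO:PacketResponder 1 for block blk_38865049064139660 terminating"
m = format_to_regex("<Date> <Time> <Level>:<Content>").match(line)
print(m.group("Date"), m.group("Level"))
# -> 081109 INFO
```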
    

    After running logparser, you can obtain extracted event templates and parsed structured logs in the output folder.

    • *_templates.csv (See example HDFS_2k.log_templates.csv)

      | EventId | EventTemplate | Occurrences |
      |---|---|---|
      | dc2c74b7 | PacketResponder <*> for block <*> terminating | 311 |
      | e3df2680 | Received block <*> of size <*> from <*> | 292 |
      | 09a53393 | Receiving block <*> src: <*> dest: <*> | 292 |
    • *_structured.csv (See example HDFS_2k.log_structured.csv)

      | ... | Level | Content | EventId | EventTemplate | ParameterList |
      |---|---|---|---|---|---|
      | ... | INFO | PacketResponder 1 for block blk_38865049064139660 terminating | dc2c74b7 | PacketResponder <*> for block <*> terminating | ['1', 'blk_38865049064139660'] |
      | ... | INFO | Received block blk_3587508140051953248 of size 67108864 from /10.251.42.84 | e3df2680 | Received block <*> of size <*> from <*> | ['blk_3587508140051953248', '67108864', '/10.251.42.84'] |
      | ... | INFO | Verification succeeded for blk_-4980916519894289629 | 32777b38 | Verification succeeded for <*> | ['blk_-4980916519894289629'] |
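The two output files are closely related: the templates file is essentially the structured file grouped by event with occurrence counts. A sketch of that relationship with illustrative rows (not the real HDFS data), using pandas:

```python
import pandas as pd

# Illustrative rows shaped like *_structured.csv (made-up sample data).
structured = pd.DataFrame({
    "EventId": ["dc2c74b7", "dc2c74b7", "e3df2680"],
    "EventTemplate": [
        "PacketResponder <*> for block <*> terminating",
        "PacketResponder <*> for block <*> terminating",
        "Received block <*> of size <*> from <*>",
    ],
})

# *_templates.csv is essentially this grouped view: one row per event
# template together with its occurrence count.
templates = (structured.groupby(["EventId", "EventTemplate"], as_index=False)
             .size().rename(columns={"size": "Occurrences"}))
print(templates)
```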

Production use

Logparser is intended mainly for research and benchmarking. Researchers can use it as a code base to develop new log parsers, while practitioners can assess the performance and scalability of current log parsing methods through our benchmarks. Practitioners are welcome to try logparser in their production environments, but be aware that the current implementation is far from production-ready. While we currently have no plan to productionize it ourselves, we do have a few suggestions for developers who want to build an intelligent, production-grade log parser:

  • Please be aware of the licenses of the third-party libraries used in logparser. We suggest keeping only the parser you need, deleting the others, and rebuilding the package wheel; this will not break logparser.
  • Please enhance logparser for efficiency and scalability, e.g., with multi-processing, failure recovery, and persistence to disk or a message queue such as Kafka.
  • Drain3 is a good reference implementation, built with practical enhancements for production scenarios.

🔥 Citation

If you use our logparser tools or benchmarking results in your publication, please cite the following papers.

🤗 Contributors

  • zhujiem (Zhujiem)
  • PinjiaHe (Pinjia He)
  • JinYang88 (LIU, Jinyang)
  • Siyuexi (Junjielong Xu)
  • ShilinHe (Shilin HE)
  • JosephMeghanathD (Joseph)
  • jcordon5 (José A. Cordón)
  • rustamtemirov (Rustam Temirov)
  • gaiusyu (Siyu Yu (Youth Yu))
  • thomasryck (Thomas Ryckeboer)
  • IsuruBoyagane15 (Isuru Boyagane)

Discussion

You are welcome to join our WeChat group for questions and discussion. Alternatively, you can open an issue here.
