# iLogtail

## Installation and Configuration

### Enable DataKit Collector
```shell
# Enable the logstreaming collector from its sample config, then restart DataKit
cd /usr/local/datakit/conf.d/samples
cp logstreaming.conf.sample logstreaming.conf
systemctl restart datakit
```
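Optionally, you can check that the endpoint is reachable before wiring up iLogtail. This is a sketch, assuming DataKit's default listen port `9529` and the `source` query parameter; adjust for your deployment.

```shell
# Smoke-test the logstreaming endpoint (assumed: DataKit default port 9529).
# The "|| echo" keeps the command from failing when DataKit is not running.
curl -s -X POST "http://localhost:9529/v1/write/logstreaming?source=smoke-test" \
     --data 'hello logstreaming' || echo "DataKit not reachable"
```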
### iLogtail Configuration

#### Download iLogtail

```shell
wget https://ilogtail-community-edition.oss-cn-shanghai.aliyuncs.com/latest/ilogtail-latest.linux-amd64.tar.gz
```
#### Create Configuration File

After extracting the archive (e.g. `tar -xzf ilogtail-latest.linux-amd64.tar.gz`), create a file named `file_sample.yaml` in the `ilogtail-1.7.1/user_yaml_config.d` directory. This configuration collects logs from the `/usr/local/df-demo/log-demo/logs/log.log` file:
```yaml
enable: true
inputs:
  - Type: file_log          # collector type
    LogPath: /usr/local/df-demo/log-demo/logs   # directory containing the log file
    FilePattern: log.log    # file name pattern
flushers:
  - Type: flusher_stdout    # also print collected logs to stdout
    OnlyStdout: true
  - Type: flusher_http      # push collected logs to DataKit's logstreaming endpoint
    RemoteURL: "http://localhost:9529/v1/write/logstreaming"
```
#### Start iLogtail
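With the configuration in place, start the collector from the extracted directory. The command below follows the community edition's quick-start convention (assumed; check the README in your extracted archive):

```shell
cd ilogtail-1.7.1
# Run iLogtail in the background, redirecting its output to log files
nohup ./ilogtail > stdout.log 2> stderr.log &
```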
Manually write a few log lines to the collected file to verify collection end to end.
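For a quick manual test, append a line in the same format as the application's logback output. The path below uses `/tmp` for illustration only; on the real host, write to `/usr/local/df-demo/log-demo/logs/log.log`. All field values in the line are made up.

```shell
# Hypothetical log line matching the logback pattern used by the demo app.
LOG_FILE=/tmp/log-demo.log   # stand-in for /usr/local/df-demo/log-demo/logs/log.log
echo '2024-01-01 10:00:00.123 [main] INFO  c.g.demo.App - [run,42] - [log-demo][1234567890][987654321] - hello iLogtail' >> "$LOG_FILE"
tail -n 1 "$LOG_FILE"
```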
At this point, Guance has received the logs pushed by iLogtail.
## Pipeline Configuration
The logs used here are generated by a Java application and written to `/usr/local/df-demo/log-demo/logs/log.log`. If your log format differs, the value of `content` shown in the diagram will vary, and the Pipeline must be adjusted accordingly.

The log output format configured in the Java application's `logback.xml` is shown below; its output corresponds to the value of `content` in the diagram above.
```
%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{20} - [%method,%line] - [%X{dd.service}][%X{dd.trace_id}][%X{dd.span_id}] - %msg%n
```
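For context, a minimal `logback.xml` using this pattern might look like the following sketch; the appender name and root level are assumptions, not taken from the original project:

```xml
<configuration>
  <!-- Sketch only: the file path matches the collected file above -->
  <appender name="FILE" class="ch.qos.logback.core.FileAppender">
    <file>/usr/local/df-demo/log-demo/logs/log.log</file>
    <encoder>
      <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{20} - [%method,%line] - [%X{dd.service}][%X{dd.trace_id}][%X{dd.span_id}] - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="FILE"/>
  </root>
</configuration>
```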
The Pipeline below splits the complete log content, extracting fields such as `host`, `host_ip`, and `trace_id`.

In Guance, go to 【Logs】-> 【Text Processing (Pipeline)】and create a new Pipeline. Enter `default` as the log source, input the parsing rules below, and save.
```
# Parse the JSON payload pushed by iLogtail
json_data = load_json(_)
contents = json_data["contents"]
tags = json_data["tags"]
content = contents["content"]

# Split the raw log line into structured fields
grok(content, "%{TIMESTAMP_ISO8601:time}%{SPACE}\\[%{NOTSPACE:thread_name}\\]%{SPACE}%{LOGLEVEL:status}%{SPACE}%{NOTSPACE:class_name}%{SPACE}%{SPACE}-%{SPACE}\\[%{NOTSPACE:method_name},%{NUMBER:line}\\]%{SPACE}-%{SPACE}\\[%{DATA:service_name}\\]\\[%{DATA:trace_id}\\]\\[%{DATA:span_id}\\]%{SPACE}-%{SPACE}%{GREEDYDATA:msg}")

# Promote the raw line and iLogtail tags to top-level fields
add_key(message, content)
add_key(host_ip, tags["host.ip"])
add_key(host, tags["host.name"])
add_key(filepath, tags["log.file.path"])
```
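To see what the Pipeline operates on, here is an illustrative sample of the JSON that `flusher_http` posts. The field values are invented; the structure (`contents`, `tags`) matches what the script above reads:

```json
{
  "contents": {
    "content": "2024-01-01 10:00:00.123 [main] INFO  c.g.demo.App - [run,42] - [log-demo][1234567890][987654321] - hello iLogtail"
  },
  "tags": {
    "host.ip": "10.0.0.1",
    "host.name": "demo-host",
    "log.file.path": "/usr/local/df-demo/log-demo/logs/log.log"
  },
  "time": 1704067200
}
```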
Start the Java application to generate logs; iLogtail will pick them up automatically.

In Guance, you can see that the logs have been collected and the Pipeline is working as expected.