Posted: 2020-10-06 00:33:58
Problem description:
Let me explain my existing setup. I have four servers (web server, API server, database server, SSIS server), with Filebeat and Winlogbeat installed on all four, and from there I ship all the logs to my Logstash. However, the content I receive in the message body differs for each log, and for some messages I am struggling to write a correct grok pattern. Is there any way I can derive the pattern from Kibana? (FYI, I currently store all the logs in Elasticsearch and can view them through Kibana.)
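On the grok question: Kibana ships a Grok Debugger (under Dev Tools) where you can paste a sample message straight from Kibana and iterate on a pattern until it matches. As a minimal sketch of where such a pattern goes in Logstash (the pattern itself is a placeholder, since the actual message layouts vary per server):

filter {
  grok {
    # Placeholder pattern: replace with one matching your actual message layout.
    match => { "message" => "%{TIMESTAMP_ISO8601:log_timestamp} %{LOGLEVEL:level} %{GREEDYDATA:log_message}" }
  }
}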
My Logstash configuration looks like this:
1. Api-Pipeline

input {
  beats {
    host => "IP address where my Filebeat (API server) is running"
    port => 5044
  }
}

2. DB Pipeline

input {
  beats {
    host => "IP address where my Filebeat (database server) is running"
    port => 5044
  }
}
It was working when I only specified the port, and it stopped working as soon as I added host. Can anyone help me?
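For context: in the Logstash beats input, host is the local address Logstash binds to (default 0.0.0.0), not the address of the remote Filebeat, so pointing it at a Filebeat server's IP makes the listener fail to bind; and two pipelines cannot both listen on port 5044 anyway. A sketch of a single shared input that both servers send to:

input {
  beats {
    # Bind on all local interfaces; Filebeat connects to Logstash,
    # so no remote IP belongs here.
    host => "0.0.0.0"
    port => 5044
  }
}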
Below is what I am trying to achieve.
I have made the changes here; will this work? I need to write lengthy filters, which is why I want to keep them in separate files.
Filebeat.yml on API Server
-----------------------------------------------------------------------------------------
filebeat.inputs:
- type: log
  # 'source' is meant to tag events with the server name; see the note after this file.
  source: 'ApiServerName'   # MyAPIServerName (same server where Filebeat is installed)
  enabled: true
  paths:
    - C:\Windows\System32\LogFiles\SMTPSVC1\*.log
    - E:\AppLogs\*.json
  scan_frequency: 10s
  ignore_older: 24h

filebeat.config.modules:
  path: C:\Program Files\Filebeat\modules.d\iis.yml
  reload.enabled: false

setup.template.settings:
  index.number_of_shards: 3

setup.kibana:
  host: "kibanaServerName:5601"

output.logstash:
  hosts: ["logstashServerName:5044"]
Logstash Configuration
----------------------------------------------------------------
pipelines.yml
- pipeline.id: beats-server
  config.string: |
    input { beats { port => 5044 } }
    output {
      if [source] == 'APISERVERNAME' {
        pipeline { send_to => apilog }
      } else if [source] == 'DBSERVERNAME' {
        pipeline { send_to => dblog }
      } else {
        pipeline { send_to => defaultlog }
      }
    }
- pipeline.id: apilog-processing
  path.config: "/Logstash/config/pipelines/apilogpipeline.conf"
- pipeline.id: dblog-processing
  path.config: "/Logstash/config/pipelines/dblogpipeline.conf"
- pipeline.id: defaultlog-processing
  path.config: "/Logstash/config/pipelines/defaultlogpipeline.conf"
1. apilogpipeline.conf
----------------------------------------------------------
input {
  pipeline {
    address => apilog
  }
}

output {
  file {
    path => "C:/Logs/apilog_%{+yyyy_MM_dd}.log"
  }
}
2. dblogpipeline.conf
---------------------------------------------------------
input {
  pipeline {
    address => dblog
  }
}

output {
  file {
    path => "C:/Logs/dblog_%{+yyyy_MM_dd}.log"
  }
}
3. defaultlogpipeline.conf
---------------------------------------------------------
input {
  pipeline {
    address => defaultlog
  }
}

output {
  file {
    path => "C:/Logs/defaultlog_%{+yyyy_MM_dd}.log"
  }
}
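Since the logs ultimately live in Elasticsearch (as mentioned at the top), each per-source pipeline can ship there alongside, or instead of, the file output. A sketch, where the host and index name are assumptions to adapt:

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]    # assumption: a local ES node
    index => "apilog-%{+YYYY.MM.dd}"      # hypothetical per-day index
  }
}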
Tags: elasticsearch logstash kibana filebeat