elasticsearch - How to define separate indexes for different logs in Filebeat/ELK?


I am wondering how to create separate indexes for the different logs fetched into Logstash (which are later passed on to Elasticsearch), so that in Kibana I can define two index patterns for them and discover them separately.

In my case, I have a few client servers (each with Filebeat installed) and a centralized log server (ELK). Each client server has different kinds of logs, e.g. redis.log, Python logs, MongoDB logs, which I would like to sort into different indexes and store in Elasticsearch.

Each client server also serves a different purpose, e.g. databases, UIs, applications. Hence I would like to give them different index names (by changing the output index in filebeat.yml?).
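What I have in mind is something like the following in filebeat.yml (a sketch only; the exact setting depends on the Filebeat version, and the index name "db-server" is made up for illustration):

    # filebeat.yml (sketch): ship directly to Elasticsearch
    # with a per-server index name
    output:
      elasticsearch:
        hosts: ["localhost:9200"]
        index: "db-server"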

In your Filebeat configuration, you can use document_type to identify the different logs that you have. Then inside of Logstash you can set the value of the type field to control the destination index.
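For example, the type field can be used in a Logstash conditional to route events to different indices (a sketch; the index names here are illustrative):

    output {
      # Route redis logs to their own index; everything else
      # goes to a catch-all index. "redis-" and "other-" are
      # example names, not required values.
      if [type] == "redis" {
        elasticsearch {
          hosts => "localhost:9200"
          index => "redis-%{+YYYY.MM.dd}"
        }
      } else {
        elasticsearch {
          hosts => "localhost:9200"
          index => "other-%{+YYYY.MM.dd}"
        }
      }
    }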

However, before you separate your logs into different indices, you should consider leaving them in a single index and using either type or a custom field to distinguish between log types. See index vs type.
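If you keep a single index, a custom field can be added per prospector instead of (or in addition to) document_type (a sketch; the field name log_type is made up):

    # filebeat.yml (sketch): tag events with a custom field
    filebeat:
      prospectors:
        - paths:
            - /var/log/redis/*.log
          fields:
            log_type: redis

By default Filebeat nests custom fields under fields, so in Kibana you would filter on fields.log_type: redis.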

Example Filebeat prospector config:

    filebeat:
      prospectors:
        - paths:
            - /var/log/redis/*.log
          document_type: redis
        - paths:
            - /var/log/python/*.log
          document_type: python
        - paths:
            - /var/log/mongodb/*.log
          document_type: mongodb

Example Logstash config:

    input {
      beats {
        port => 5044
      }
    }

    output {
      # Customize the Elasticsearch output for Filebeat.
      if [@metadata][beat] == "filebeat" {
        elasticsearch {
          hosts => "localhost:9200"
          manage_template => false
          # Use the Filebeat document_type value as the Elasticsearch index name.
          index => "%{[@metadata][type]}-%{+YYYY.MM.dd}"
          document_type => "log"
        }
      }
    }
