I want to run two Logstash pipelines in a single instance, where the output of one pipeline is the input of the other. I have read the documentation below:
https://www.elastic.co/guide/en/logstash/current/ls-to-ls.html
https://www.elastic.co/guide/en/logstash/current/pipeline-to-pipeline.html#pipeline-to-pipeline-overview
https://www.elastic.co/guide/en/logstash/current/plugins-inputs-lumberjack.html
I'm confused about which approach I should follow. What I want is shown below.
The first Logstash pipeline:
input {
  # some plugins and codecs are here
}
filter {
  # some operations here
}
output {
  elasticsearch {
    ...
  }
  file {
    ...
  }
  logstash {
  }
}
The second one, like below:
input {
  logstash {
  }
}
filter {
  # some operations are here
}
output {
  elasticsearch {
  }
}
I know there is no plugin named logstash; I used that name only to explain the situation. So, which approach should I follow for this purpose? Do I need a message queue (Kafka, Redis), the Lumberjack protocol, or Beats, or is there a better alternative?
Can someone answer with a basic pipeline configuration for this purpose?
Thanks for answering.
The easiest way to solve this is pipeline-to-pipeline communication, configured in pipelines.yml as shown below:
- pipeline.id: first_pipeline
  config.string: |
    input { stdin { } }
    output { pipeline { send_to => [commonOut] } }
- pipeline.id: second_pipeline
  config.string: |
    input { pipeline { address => commonOut } }
    filter {
    }
    output { stdout { codec => rubydebug } }
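For completeness, here is a minimal sketch of how the same pipelines.yml layout could be mapped onto the configuration in the question. The Elasticsearch hosts, index names, and file path are placeholders invented for illustration; swap in your real inputs, filters, and outputs:

- pipeline.id: first_pipeline
  config.string: |
    input {
      # your real input plugins and codecs go here
      stdin { }
    }
    filter {
      # some operations here
    }
    output {
      elasticsearch {
        hosts => ["http://localhost:9200"]   # placeholder host
        index => "first-index"               # placeholder index name
      }
      file {
        path => "/tmp/first_pipeline.log"    # placeholder path
      }
      # forward a copy of every event to the second pipeline
      pipeline { send_to => [commonOut] }
    }
- pipeline.id: second_pipeline
  config.string: |
    input {
      pipeline { address => commonOut }
    }
    filter {
      # some operations are here
    }
    output {
      elasticsearch {
        hosts => ["http://localhost:9200"]   # placeholder host
        index => "second-index"              # placeholder index name
      }
    }

No message queue (Kafka, Redis), Lumberjack, or Beats is needed here: pipeline-to-pipeline communication is meant for exactly this case, where both pipelines run inside the same Logstash instance. Those other options matter when the pipelines live in separate Logstash processes or you need a durable buffer between them. Also note that pipelines.yml is only read when Logstash is started without the -e or -f options, so don't pass a config on the command line.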