Hey folks,
I need some help.
I have the following Logstash configuration files:
agent.conf
input {
  log4j {
    type => "bdj"
    port => 25827
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  stdout {
    codec => rubydebug
  }
  redis {
    host => "127.0.0.1"
    data_type => "list"
    key => "logstash"
  }
}
This agent.conf receives log4j events over TCP and forwards them to Redis.
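As an aside, the json filter used in both configs parses the JSON string held in the event's message field and merges the resulting fields into the event. A rough Python sketch of that step, assuming an illustrative event shape (the field names in the sample event are made up, not from my actual logs):

```python
import json

def apply_json_filter(event):
    """Sketch of Logstash's json filter with source => "message":
    parse the JSON string in `message` and merge the result into the event."""
    try:
        parsed = json.loads(event["message"])
        event.update(parsed)  # parsed fields sit alongside the original ones
    except (KeyError, ValueError):
        # Logstash tags unparseable events with _jsonparsefailure
        event["tags"] = event.get("tags", []) + ["_jsonparsefailure"]
    return event

# Example: a log4j event whose message body happens to be JSON
event = {"type": "bdj", "message": '{"level": "INFO", "msg": "user logged in"}'}
event = apply_json_filter(event)
print(event["level"])  # INFO
```

If the message field is not valid JSON, the real filter leaves the event intact and just adds the failure tag, which the sketch mirrors.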
central.conf
input {
  redis {
    host => "localhost"
    type => "redis-input"
    data_type => "list"
    key => "logstash"
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  stdout { }
  elasticsearch {
    hosts => "localhost"
    index => "logstash-%{+YYYY.MM.dd}"
  }
}
central.conf, in turn, reads the events back out of Redis and forwards them to Elasticsearch.
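My understanding is that with data_type => "list" the Redis key behaves as a queue: the agent pushes each event onto the list and the central instance pops it off, so every event should be delivered exactly once. A minimal in-memory sketch of that hand-off, using a plain deque as a stand-in for the live Redis list:

```python
from collections import deque

queue = deque()  # stands in for the Redis list at key "logstash"

def agent_publish(event):
    """agent.conf's redis output: push the event onto the tail of the list."""
    queue.append(event)

def central_consume():
    """central.conf's redis input: pop the next event off the head of the list."""
    return queue.popleft() if queue else None

agent_publish({"msg": "event 1"})
agent_publish({"msg": "event 2"})

delivered = []
while queue:
    delivered.append(central_consume())

print(len(delivered))  # 2 -- each event is popped exactly once
```

That is the behaviour I expect, which is why seeing each event twice in Elasticsearch surprises me.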
The problem is that every event is being duplicated, as if it were going through the pipeline in a loop, or something like that.
I'm running Logstash as a service on Debian:
root@logs:~# uname -a
Linux logs 3.2.0-4-amd64 #1 SMP Debian 3.2.78-1 x86_64 GNU/Linux
Can anyone shed some light on this?