I have a database table that contains 400+ records, but whenever I run Logstash with the JDBC input it only retrieves 126 of them. I tried several times with a different index each time, but it still fetches only 126 records.
Here is the Logstash configuration:
input {
  jdbc {
    -
    -
    jdbc_driver_class => "org.postgresql.Driver"
    statement => "SELECT * from xfailure"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    manage_template => false
    index => "test_predictions"
    document_id => "%{id}"
  }
}
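The two dashes above stand for connection settings I removed. For context, a minimal jdbc input with those settings filled in would look roughly like this (the URL, credentials, and driver path below are placeholders, not my real values):

input {
  jdbc {
    # Placeholder connection settings: adjust to your own database and driver location
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/mydb"
    jdbc_user => "postgres"
    jdbc_password => "changeme"
    jdbc_driver_library => "/path/to/postgresql-42.x.x.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    statement => "SELECT * from xfailure"
  }
}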
When I check the number of rows in the database itself, I find that the table has 400+ records. Here is the select statement:
SELECT * FROM xfailure;
Here is a sample of the record's columns:
id|modified|x_url|x_id|correlation_id|links|code|description|explanation|tr_id
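(As an aside, not part of my original setup: a plain count query confirms the total row count more directly than scanning the output of a full SELECT.)

SELECT COUNT(*) FROM xfailure;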
When I created an index pattern in Kibana, I used the timestamp as the time field, but the number of hits was just 162!
I found a solution for this, which is to specify each field explicitly by changing the SQL query from:
SELECT * FROM xfailure;
to
SELECT field_1 as field_1, field_2 as field_2 FROM xfailure;
And it now works fine.
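For reference, applying that pattern to the columns listed above gives a query along these lines (an illustrative expansion, written out here for clarity):

SELECT id AS id,
       modified AS modified,
       x_url AS x_url,
       x_id AS x_id,
       correlation_id AS correlation_id,
       links AS links,
       code AS code,
       description AS description,
       explanation AS explanation,
       tr_id AS tr_id
FROM xfailure;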