I am ingesting SQL Server ERRORLOG files using Filebeat with the mssql module, sending directly to the Elasticsearch output, but in the message field I am getting errors like this: '\u00000\u00002\u00004\u0000-\u00000\u00004\u0000-\u00002\u00005\u0000 \u00002\u00002\u0000:\u00001\u00000\u0000:\u00000\u00003\u0000.\u00002\u00005\u0000' could not be parsed at index 0.
Is there anything I can do to fix this so that these logs get into Elasticsearch line by line?
I am currently considering ingesting these logs into Logstash instead, and having Logstash send them on to an Elasticsearch index for me.
My module config (mssql.yml) is currently like this:
- module: mssql
  # Fileset for native deployment
  log:
    enabled: true

    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    var.paths: ['D:\path_to_log\Log\ERRORLOG*']
    encoding: utf-16
I think a config like this should work:
- module: mssql
  log:
    enabled: true
    var.paths: ['D:\path_to_log\Log\ERRORLOG*']
    input:
      encoding: utf-16le
      exclude_files: [".gz$"]
      multiline.pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2}[[:space:]][0-9]{2}:[0-9]{2}:[0-9]{2}\.[0-9]{2}'
      multiline.negate: true
      multiline.match: after
But it would help to see a sample error log line and your Elastic Stack version.
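If the module config above does not solve it and you do go the Logstash route you mentioned, here is a minimal sketch of a pipeline that decodes the UTF-16LE bytes SQL Server writes and joins continuation lines onto the preceding timestamped line before indexing. The host, index name, and forward-slash path are placeholder assumptions you would need to adjust:

input {
  file {
    # Logstash file input prefers forward slashes, even on Windows (assumed path)
    path => "D:/path_to_log/Log/ERRORLOG*"
    start_position => "beginning"
    # Decode UTF-16LE and treat any line that does not start with a
    # timestamp as a continuation of the previous event
    codec => multiline {
      charset => "UTF-16LE"
      pattern => '^[0-9]{4}-[0-9]{2}-[0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2}\.[0-9]{2}'
      negate => true
      what => "previous"
    }
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]        # assumed host
    index => "mssql-errorlog-%{+YYYY.MM.dd}"  # assumed index name
  }
}

If you stay Filebeat-only, the input-level encoding and multiline settings in the module config above should achieve the same result without the extra hop.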