[Troubleshooting] Elasticsearch missing logs or getting empty logs due to Ingest pipelines

Jasmine H
1 min read · Oct 13, 2021

Situation:

  1. Elasticsearch is missing logs from fluentd, or
  2. Logs reach Elasticsearch, but the message field is an empty string.

Troubleshooting steps:

  • Edit the fluentd config to get more detailed logs: set @log_level debug and log_es_400_reason true in the <match> block.

  • Check the fluentd logs:
2021-10-18T14:43:40.995128472+08:00 stdout F 2021-10-08 06:43:40 +0000 [warn]: #0 dump an error event: error_class=Fluent::Plugin::ElasticsearchErrorHandler::ElasticsearchError error="400 - Rejected by Elasticsearch [error type]: illegal_argument_exception [reason]: 'com.fasterxml.jackson.core.JsonParseException: Unexpected character ('-' (code 45)): was expecting comma to separate Array entries\n at [Source: (org.elasticsearch.common.io.stream.ByteBufferStreamInput); line: 1, column: 7]'" location=nil tag="kubernetes.var.log.containers………
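For reference, the two settings go inside the Elasticsearch output block of the fluentd config. A minimal sketch, assuming fluent-plugin-elasticsearch; the match pattern, host, and port are placeholders for your own setup:

```
<match kubernetes.**>
  @type elasticsearch
  @log_level debug          # verbose plugin logging
  log_es_400_reason true    # include the Elasticsearch 400 reason in warn logs
  host elasticsearch.example.local   # placeholder
  port 9200
</match>
```

Without log_es_400_reason, the plugin only reports that a bulk request was rejected; with it, the warning includes the underlying Elasticsearch error, as in the log line above.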

Root cause:

In my case, the error came from an ingest pipeline. I had set a JSON processor to parse the JSON message into fields:

{
  "json": {
    "field": "message",
    "target_field": "message_json"
  }
}

When an incoming log message is not valid JSON, this processor fails with the JsonParseException: Unexpected character error seen above.
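The failure mode is easy to reproduce outside Elasticsearch: any strict JSON parser behaves the same way. A small sketch with Python's json module (the sample messages are made up):

```python
import json

# Hypothetical samples: one container emits JSON logs, another plain text.
json_message = '{"level": "info", "msg": "request served"}'
plain_message = "2021-10-08 06:43:40 +0000 some plain-text log line"

# A JSON log parses cleanly into fields, as the ingest processor expects.
parsed = json.loads(json_message)
print(parsed["msg"])  # -> request served

# A plain-text log raises a parse error. Here the parser reads the leading
# "2021" as a number and then chokes on the '-', much like the
# "Unexpected character ('-' (code 45))" in the Elasticsearch log above.
try:
    json.loads(plain_message)
except json.JSONDecodeError as e:
    print(f"parse failed: {e}")
```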

Solution:

Add "ignore_failure": true to the JSON processor in the ingest pipeline:

{
  "json": {
    "field": "message",
    "target_field": "message_json",
    "ignore_failure": true
  }
}
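You can verify the fix without re-sending logs by running the processor against sample documents with Elasticsearch's simulate pipeline API, e.g. from Kibana Dev Tools (the two sample documents below are made up):

```
POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "processors": [
      {
        "json": {
          "field": "message",
          "target_field": "message_json",
          "ignore_failure": true
        }
      }
    ]
  },
  "docs": [
    { "_source": { "message": "{\"level\": \"info\", \"msg\": \"ok\"}" } },
    { "_source": { "message": "plain text, not JSON" } }
  ]
}
```

The JSON document comes back with a message_json field; the plain-text one passes through unchanged instead of being rejected. Note that with ignore_failure the processor fails silently; if you want to keep track of unparseable messages, an on_failure handler (for example, one that tags the document) is an alternative.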


Jasmine H

Data Engineer from Taiwan, recently working on EFK and Kubernetes projects.