Bug report
System info:
Telegraf 1.0.0 running under Docker, using the docker image obtained from Docker Hub.
Steps to reproduce:
- Run Telegraf under Docker, configured to consume from a Kafka topic.
- Run InfluxDB as the output that Telegraf writes to.
- Fill the Kafka topic with valid JSON data.
Expected behavior:
The JSON data should be consumed without error.
Actual behavior:
After approximately 350,000 metrics have been consumed, Telegraf panics with the following output:
2016/09/14 01:00:52 Output [influxdb] wrote batch of 1000 metrics in 125.002825ms
2016/09/14 01:00:52 Output [influxdb] wrote batch of 1000 metrics in 96.751418ms
2016/09/14 01:00:58 ERROR: Post http://influxdb:8086/write?consistency=any&db=telemetry&precision=ns&rp=: net/http: request canceled (Client.Timeout exceeded while awaiting headers)
panic: runtime error: invalid memory address or nil pointer dereference
[signal 0xb code=0x1 addr=0x0 pc=0x5a48d6]
goroutine 30 [running]:
panic(0x12cefe0, 0xc82000a090)
/usr/local/go/src/runtime/panic.go:481 +0x3e6
github.com/influxdata/telegraf/plugins/inputs/kafka_consumer.(*Kafka).receiver(0xc8200b40c0)
/home/ubuntu/telegraf-build/src/github.com/influxdata/telegraf/plugins/inputs/kafka_consumer/kafka_consumer.go:131 +0x276
created by github.com/influxdata/telegraf/plugins/inputs/kafka_consumer.(*Kafka).Start
/home/ubuntu/telegraf-build/src/github.com/influxdata/telegraf/plugins/inputs/kafka_consumer/kafka_consumer.go:117 +0x232
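For what it's worth, a plausible mechanism (my assumption, not confirmed against the Telegraf source) is that the InfluxDB write timeout leads to the consumer connection being torn down, its message channel being closed, and the receiver goroutine then dereferencing the nil message pointer that a receive on a closed channel yields. The zero-value-on-closed-channel behavior is easy to demonstrate in isolation; `msg` here is a stand-in for sarama's `*ConsumerMessage`:

```go
package main

import "fmt"

// msg is a simplified stand-in for the Kafka consumer's message type.
type msg struct{ Topic string }

func main() {
	in := make(chan *msg)
	close(in) // simulates the consumer shutting down mid-stream

	// A receive on a closed channel returns the zero value: a nil *msg.
	m, ok := <-in
	if !ok || m == nil {
		// Without this guard, m.Topic would dereference nil and panic,
		// matching the trace above.
		fmt.Println("channel closed, skipping nil message")
		return
	}
	fmt.Println(m.Topic)
}
```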
Additional info:
Kafka Consumer configuration:
[[inputs.kafka_consumer]]
## topic(s) to consume
topics = ["nom-telemetry"]
## an array of Zookeeper connection strings
zookeeper_peers = ["nom-kafka:2181"]
## Zookeeper Chroot
zookeeper_chroot = "/kafka"
## the name of the consumer group
consumer_group = "telegraf_metrics_consumers"
## Offset (must be either "oldest" or "newest")
offset = "oldest"
## Data format to consume.
## Each data format has its own unique set of configuration options; read
## more about them here:
## https://github.com/influxdata/telegraf/blob/master/docs/DATA_FORMATS_INPUT.md
data_format = "json"