Kubernetes Cluster Log Aggregation with Filebeat


The Kubernetes cluster I was working on finally went live, and I hadn't provided a log aggregation solution yet. I had a look at Dynatrace, which is a paid SaaS. However, it requires installing an agent in every container. That's fine when there are only a few containers to play with, but I wasn't going to rebuild dozens of Docker containers just to get logs out.

Luckily enough I found Filebeat from Elastic, which can be installed as a DaemonSet in a Kubernetes cluster and then ship all logs to Elasticsearch. Since I already have an Elasticsearch cluster running, why not. The installation is quite easy following this guide:

1. Download the manifest.
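Elastic hosts the manifest in its beats repository; the exact URL below assumes a 7.17 release, so swap in the version that matches your Elasticsearch cluster:

curl -L -O https://raw.githubusercontent.com/elastic/beats/7.17/deploy/kubernetes/filebeat-kubernetes.yaml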

2. The only configuration that needs to change is the Elasticsearch connection settings:

 env:
   - name: ELASTICSEARCH_HOST
     value: 10.1.1.10
   - name: ELASTICSEARCH_PORT
     value: "9200"
   - name: ELASTICSEARCH_USERNAME
     value: elastic
   - name: ELASTICSEARCH_PASSWORD
     value: changeme
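These four variables are all you need to touch because the stock manifest's ConfigMap expands them in the output section of filebeat.yml, roughly like this (defaults as in the Elastic manifest; verify against the version you downloaded):

output.elasticsearch:
  hosts: ['${ELASTICSEARCH_HOST:elasticsearch}:${ELASTICSEARCH_PORT:9200}']
  username: ${ELASTICSEARCH_USERNAME}
  password: ${ELASTICSEARCH_PASSWORD}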

Then apply it to the Kubernetes cluster:

kubectl apply -f filebeat.yaml
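To confirm the DaemonSet rolled out a Filebeat pod on every node, you can run (the filebeat name and k8s-app=filebeat label follow the stock manifest; adjust if yours differ):

kubectl get ds filebeat -n kube-system
kubectl get pods -n kube-system -l k8s-app=filebeat -o wide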

3. If the Docker containers running in the cluster are already logging to stdout/stderr, you should see logs flowing into Elasticsearch. Otherwise, check the Filebeat logs in the Kubernetes dashboard (they're in the kube-system namespace).
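If you prefer the command line over the dashboard, the same logs are available via kubectl, again assuming the stock k8s-app=filebeat label:

kubectl logs -n kube-system -l k8s-app=filebeat --tail=50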

4. Make sure to create an index pattern for Filebeat in Kibana, usually filebeat-*.
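Before creating the index pattern, you can confirm that Filebeat indices are actually being written by querying Elasticsearch's _cat API directly (host and credentials here are the example values from the manifest above):

curl -u elastic:changeme 'http://10.1.1.10:9200/_cat/indices/filebeat-*?v'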

That’s about it 🙂