Don’t Need Ngrok When I Have SSH

I was trying to create a Slack app. To let Slack send REST requests to my dev environment, e.g. http://localhost:9000, I searched a bit and found ngrok. Ngrok is very handy for this kind of setup:

Slack -> ngrok -> localhost

However, I just didn’t want to install anything, so I turned to Google, and to my surprise SSH can do exactly this (and has been able to for who knows how many years). I knew I could forward a local port to a remote host to connect to a service behind a firewall, such as a database, but this was my first attempt at forwarding a remote port to local so the Slack API could reach my localhost.

Here’s a good article that explains how to do port forwarding in both directions with SSH.

In short, to forward a remote port to my localhost, I need to:

1, update sshd_config on the remote host to enable GatewayPorts, then restart the SSH service

GatewayPorts yes
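
On a typical systemd-based host that boils down to something like this (a sketch; it assumes the option is present but commented out in the stock /etc/ssh/sshd_config, so adjust for your distro):

# flip GatewayPorts to yes in the sshd config, then restart sshd
sudo sed -i 's/^#\?GatewayPorts .*/GatewayPorts yes/' /etc/ssh/sshd_config
sudo systemctl restart sshd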

2, in a local terminal, run the following command, replacing your-server with your server’s domain or IP.

ssh -nNT -R 9800:localhost:9000 user@your-server

Then test it with

curl -i http://your-server:9800

The request should be forwarded to your localhost:9000.


Kubernetes Log Aggregation with Filebeat and Logstash

Following up on the last blog: Filebeat is very easy to set up, but it doesn’t do log pattern matching. Guess I’ll need Logstash after all.

The first step is, of course, to install Logstash. Telling Filebeat to feed Logstash instead of Elasticsearch is straightforward; here are some configuration snippets:

Filebeat K8s configMap:

apiVersion: v1
kind: ConfigMap
metadata:
  name: filebeat-config
  namespace: kube-system
  labels:
    k8s-app: filebeat
    kubernetes.io/cluster-service: "true"
data:
  filebeat.yml: |-
    # ... rest of the default config unchanged ...
    # replace output.elasticsearch with this
    output.logstash:
      hosts: ['${LOGSTASH_HOST:logstash}:${LOGSTASH_PORT:5044}']
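
To roll the change out, reapply the manifest and recreate the Filebeat pods, since DaemonSet pods don’t pick up ConfigMap edits on their own (a sketch; the file name is whatever the manifest was saved as, and the label selector assumes the k8s-app: filebeat label above):

kubectl apply -f filebeat-kubernetes.yaml
kubectl -n kube-system delete pod -l k8s-app=filebeat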

Sample Logstash configuration:

input {
  beats {
    port => "5044"
  }
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}
output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}
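
Before restarting Logstash, it’s worth validating the pipeline file first (assuming it was saved as logstash.conf in the Logstash directory; --config.test_and_exit parses the config, reports any errors and exits):

bin/logstash -f logstash.conf --config.test_and_exit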

COMBINEDAPACHELOG is the standard Apache combined log format (nginx’s default format is essentially the same). By using this predefined pattern, values like the request URI or referrer URL become available as fields in Elasticsearch.
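
For example, a combined-format log line like this one (the classic sample from the Apache docs):

127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /apache_pb.gif HTTP/1.0" 200 2326 "http://www.example.com/start.html" "Mozilla/4.08 [en] (Win98; I ;Nav)"

comes out of the grok filter as individual fields such as clientip, auth, timestamp, verb, request, response, bytes, referrer and agent, each searchable on its own.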


Kubernetes Cluster Log Aggregation with Filebeat

Finally the Kubernetes cluster I was working on went live, but I hadn’t provided a log aggregation solution yet. I had a look at dynaTrace, which is a paid SaaS, but it requires installing an agent in every container. That’s fun when there are only a few containers to play with, but I wasn’t going to rebuild dozens of Docker containers just to get logs out.

Luckily enough, I found Filebeat from Elastic, which can be installed as a DaemonSet in a Kubernetes cluster and pipe all logs to Elasticsearch, and since I already had an Elasticsearch cluster running, why not. The installation is quite easy following this guide:

1, Download the manifest, for example:
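
Assuming the manifest location from the Elastic guide (pick the branch matching your Filebeat version):

curl -L -O https://raw.githubusercontent.com/elastic/beats/6.2/deploy/kubernetes/filebeat-kubernetes.yaml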

2, The only configuration that needs to be changed is the Elasticsearch connection env vars in the DaemonSet (point ELASTICSEARCH_HOST at your own cluster):

- name: ELASTICSEARCH_HOST
  value: elasticsearch
- name: ELASTICSEARCH_PORT
  value: "9200"
- name: ELASTICSEARCH_USERNAME
  value: elastic
- name: ELASTICSEARCH_PASSWORD
  value: changeme

Then load it into the Kubernetes cluster:

kubectl apply -f filebeat.yaml

3, If the Docker containers running in the cluster are already logging to stdout/stderr, you should see logs flowing into Elasticsearch; otherwise, check the Filebeat logs in the Kubernetes dashboard (it’s in the kube-system namespace) or from the command line, as below.
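
A quick way to pull those logs (assuming the stock manifest’s k8s-app: filebeat label):

kubectl -n kube-system logs -l k8s-app=filebeat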

4, Make sure to create an index pattern for Filebeat in Kibana, usually filebeat-*

That’s about it 🙂

Time Machine for Arch Linux

I’ve been using Arch Linux for some years, and it’s still my favorite Linux distribution. The feature that distinguishes Arch from others is its rolling release model, which means there’s no such thing as a version in Arch. Running the latest packages is the norm.

However, living on the edge isn’t quite safe. After I installed a bunch of updates, including GNOME Shell 3.28, my XPS 15 laptop had trouble bringing up an external monitor. It even froze when I hot-plugged the HDMI cable.

I tried to revert some packages like

sudo pacman -U /var/cache/pacman/pkg/some-package-1.0.xx.pkg.tar.xz

But it didn’t solve the problem, because there were hundreds of packages in the last update.

Almost at the point of panic, I found this instruction for reverting all packages to a snapshot in time, and it actually worked wonders for me.
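
The gist of it is pointing pacman at the Arch Linux Archive, which keeps daily snapshots of all official repositories (a sketch; the date below is an example, use the last day your system was known to be good):

# /etc/pacman.d/mirrorlist
Server=https://archive.archlinux.org/repos/2018/03/01/$repo/os/$arch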

The only surprise was that while downgrading packages, I saw errors like

 package-name: /path/to/package-file exists in filesystem

I guess it’s a safeguarding mechanism of pacman, but since I knew what I was doing, I simply deleted those files. The final command was

sudo pacman -Syyuu

which brought Arch Linux back to that point in time, and the issue was fixed 🙂