Thanks to my colleague Simon’s suggestion, I was introduced to Google Lighthouse, an open-source Node.js tool that uses Google Chrome to audit a website’s performance.
I like Lighthouse because:
- it is open source
- it is very portable
- it can run as a CLI command or as a Node.js module
Here’s a sample Dockerfile for building a container that is ready to run Lighthouse with Google Chrome on Linux.
```dockerfile
FROM debian:stretch

USER root
WORKDIR /root
ENV CHROME_VERSION="google-chrome-stable"

# system packages
RUN apt update -qqy && \
    apt install -qqy build-essential gnupg wget curl jq

# nodejs 10
RUN curl -sL https://deb.nodesource.com/setup_10.x | bash - && \
    apt install -qqy nodejs && \
    npm install -g lighthouse

# google-chrome
RUN wget -q -O - https://dl-ssl.google.com/linux/linux_signing_key.pub | apt-key add - && \
    echo "deb http://dl.google.com/linux/chrome/deb/ stable main" >> /etc/apt/sources.list.d/google-chrome.list && \
    apt update -qqy && \
    apt install -qqy ${CHROME_VERSION:-google-chrome-stable}

# python3 (optional for metric processing)
RUN apt install -qqy python3 python3-pip && \
    pip3 install influxdb

# lighthouse
RUN useradd -ms /bin/bash lighthouse
USER lighthouse
WORKDIR /home/lighthouse
```
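With the Dockerfile above saved in an empty directory, building and entering the container could look like this (the image tag `lighthouse-runner` is just an example name):

```shell
# Build the image from the Dockerfile in the current directory
docker build -t lighthouse-runner .

# Start an interactive shell in the container as the lighthouse user
docker run --rm -it lighthouse-runner /bin/bash
```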
Then Lighthouse can be executed inside the container to audit $url:
```shell
CHROME_PATH=$(which google-chrome) lighthouse $url --emulated-form-factor=none --output=json --chrome-flags="--headless --no-sandbox"
```
The resulting JSON is written to stdout, so it can easily be piped to other scripts for post-processing, e.g. parsing the JSON and extracting metrics.
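As a minimal sketch of such post-processing, here is a Python script that pulls a few numbers out of a Lighthouse report. The field names (`categories.performance.score`, `audits[...].numericValue`) follow the Lighthouse JSON report format, but treat the exact keys as assumptions since they vary between Lighthouse versions; the sample report is a made-up stub standing in for real `lighthouse --output=json` output.

```python
import json

# Stub of a Lighthouse JSON report (the real one has many more fields).
SAMPLE_REPORT = json.dumps({
    "requestedUrl": "https://example.com/",
    "categories": {"performance": {"score": 0.93}},
    "audits": {
        "first-contentful-paint": {"numericValue": 1234.5},
        "interactive": {"numericValue": 3456.7},
    },
})

def extract_metrics(report_json: str) -> dict:
    """Parse a Lighthouse JSON report and extract a few key metrics."""
    report = json.loads(report_json)
    return {
        "url": report["requestedUrl"],
        "performance_score": report["categories"]["performance"]["score"],
        "fcp_ms": report["audits"]["first-contentful-paint"]["numericValue"],
        "tti_ms": report["audits"]["interactive"]["numericValue"],
    }

if __name__ == "__main__":
    # In the container you would feed real output in via a pipe, e.g.:
    #   lighthouse $url --output=json ... | python3 extract_metrics.py
    print(extract_metrics(SAMPLE_REPORT))
```

From here the extracted dict could be pushed to a time-series store such as InfluxDB (which is why the Dockerfile installs the `influxdb` Python client).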
🙂