I’m remaking this website in the open. Follow along, if you wish: Writing, RSS, Source
There is much to improve about this site. (If you’re reading this close to the publish date, that will be obvious, as you view the nearly naked prose and utter lack of design.) But making improvements without measuring their impact risks making things worse. So, before I do any more work, I need a baseline against which to measure my changes.
Lighthouse is a helpful tool that audits a website against five main criteria: performance, accessibility, best practices, SEO, and progressive web app (PWA). Those align nicely with my goals for the site, making it an ideal fit for my needs. Lighthouse also does something quite neat and very much in line with the “learn, then teach” spirit of this site. Reporting the audit successes & failures would be useful on its own, but the Lighthouse team went much further and provided the why behind each check, many with a “learn more” link. This greatly shortens the path from “oh no, not good” to “I learned a thing!”.
There are many ways to run Lighthouse. The easiest is probably using the built-in Audits panel of Chrome’s DevTools. That’s fine for a manual workflow, but I want to automate the audits, to ensure they run against every change to the site and surface the results well. For that, I’ll use Lighthouse CI, which I can run within my GitHub Workflow. Doing so is fairly straightforward:
Add Lighthouse step to workflow (commit)
Though it’s all in one commit, this contains multiple tasks.
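As a rough sketch (not the exact committed config), the added step boils down to installing the Lighthouse CI CLI and running its autorun command; the workflow shape, step names, and elided build steps here are my assumptions:

```yaml
# Hypothetical excerpt of the GitHub workflow; names are illustrative,
# and the real commit may differ.
name: CI
on: [push]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      # ...build and deploy steps elided...
      - name: Run Lighthouse CI
        run: |
          npm install -g @lhci/cli
          lhci autorun
```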
Turn off failing assertions (commit)
While I haven’t done anything to intentionally make the performance or accessibility of this site worse (only to then improve it in a helpful blog post), I haven’t done anything extra to improve them either. Thus, the recommended assertions return [four failed checks]. For now, I’ll disable those checks to get a passing audit, then address each of them as my very next tasks.
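A minimal sketch of how such checks can be disabled in a Lighthouse CI config file; the file name and the specific audits switched off below are illustrative assumptions, not necessarily the four checks that actually failed:

```yaml
# Hypothetical .lighthouserc.yml: start from the recommended preset,
# then turn off individual failing assertions.
ci:
  assert:
    preset: lighthouse:recommended
    assertions:
      # These audit names are examples only:
      uses-text-compression: "off"
      uses-long-cache-ttl: "off"
```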
Now, whenever I push a change to the site, a Lighthouse run is kicked off as part of my CI/CD workflow, and the results are published as both a status check (if pushing to a branch with a pull request) and a detailed report. (Not that exact report: the generated one is only hosted temporarily, so that link is from a manual audit using https://web.dev/measure/, and even if it continues to work, it probably won’t reflect the state of the site when this post was published.)
Most importantly, I can now continue to improve the site knowing that my changes are truly having a positive impact on the things I care about.
I’m happy with this initial implementation, but there are aspects to improve.
Only run the second deploy step when pushing to master (in other words, when deploying to production), to avoid creating a duplicate deploy preview. GitHub workflows cannot have conditional steps, only conditional jobs, so I’d first have to split the steps into separate jobs, which would require passing data between jobs unless I duplicated a lot of the steps and logic.
After splitting steps into different jobs, the Lighthouse job could be conditional, allowing it to be skipped (for example, by adding “[skipLighthouse]” to a commit message), which would be nice for writing-only updates (like publishing this post).
Publish reports to a more permanent location, using Lighthouse CI Server.
Tweak assertions, perhaps making use of a performance budget.
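For the conditional-job idea above, GitHub’s workflow syntax allows a job-level `if:` expression; a hedged sketch (the job name and exact condition are my assumptions):

```yaml
# Hypothetical job that is skipped when the commit message
# contains "[skipLighthouse]".
jobs:
  lighthouse:
    if: "!contains(github.event.head_commit.message, '[skipLighthouse]')"
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - run: npx @lhci/cli autorun
```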