
The hunt for a kernel bug, part 4: git bisect

Now that I have a way to compile kernels from source, I want to find the exact commit where my input devices stop working. That means checking out a certain commit, building the kernel, installing it, rebooting, selecting the new kernel in Grub, and seeing if my keyboard works. I am quite sure that I need to search between 5.13.0-22 and 5.13.0-23, but that’s still 634 commits!

$ git rev-list Ubuntu-5.13.0-22.22..Ubuntu-5.13.0-23.23 | wc --lines
634

This is where git bisect comes in. It’s sort of a wizard that guides you to find a bad commit. You tell it a commit where your software was known to work and a commit where it doesn’t. It then picks a commit somewhere in the middle, you build your software and do your tests, and you tell git bisect whether the result was good or bad. It will then give you a new commit to inspect, each time narrowing the search.


Let’s do this!

$ git bisect start
$ git bisect good Ubuntu-5.13.0-22.22
$ git bisect bad Ubuntu-5.13.0-23.23
Bisecting: 316 revisions left to test after this (roughly 8 steps)
[398351230dab42d654036847a49a5839705abdcb] powerpc/bpf ppc32: Fix BPF_SUB when imm == 0x80000000
$ git describe --long 
Ubuntu-5.13.0-22.22-317-g398351230dab

In this first step, I get the 317th commit after 5.13.0-22. Let’s compile that commit:

$ time make clean olddefconfig bindeb-pkg \
    --jobs=$(getconf _NPROCESSORS_ONLN) \
    LOCALVERSION=-$(git describe --long | tr '[:upper:]' '[:lower:]')

This creates 3 .deb packages in the directory above:

$ ls -1 ../*$(git describe --long | tr '[:upper:]' '[:lower:]')*.deb
../linux-headers-5.13.19-ubuntu-5.13.0-22.22-317-g398351230dab_5.13.19-ubuntu-5.13.0-22.22-317-g398351230dab-10_amd64.deb
../linux-image-5.13.19-ubuntu-5.13.0-22.22-317-g398351230dab_5.13.19-ubuntu-5.13.0-22.22-317-g398351230dab-10_amd64.deb
../linux-libc-dev_5.13.19-ubuntu-5.13.0-22.22-317-g398351230dab-10_amd64.deb

I only need to install the headers and the image; libc-dev isn’t needed.

$ sudo dpkg --install ../linux-{headers,image}-*$(git describe --long | tr '[:upper:]' '[:lower:]')*.deb

Verify that the kernel files are in the /boot directory:

$ ls -1 /boot/*$(git describe --long | tr '[:upper:]' '[:lower:]')*
/boot/config-5.13.19-ubuntu-5.13.0-22.22-317-g398351230dab
/boot/initrd.img-5.13.19-ubuntu-5.13.0-22.22-317-g398351230dab
/boot/System.map-5.13.19-ubuntu-5.13.0-22.22-317-g398351230dab
/boot/vmlinuz-5.13.19-ubuntu-5.13.0-22.22-317-g398351230dab

Now I can reboot, select the new kernel in Grub, and test the keyboard. With commit 317, the keyboard worked, so the first bad commit has to be somewhere between commit 317 and 634:

$ git bisect good ; git describe --long
Bisecting: 158 revisions left to test after this (roughly 7 steps)
[79b62d0bba892e8367cb46ca09b623c885852c29] drm/msm/a4xx: fix error handling in a4xx_gpu_init()
Ubuntu-5.13.0-22.22-475-g79b62d0bba89

Now it’s time again for make clean olddefconfig bindeb-pkg, dpkg --install and reboot. Turns out that commit 475 was a “bad” commit (one where the keyboard didn’t work):

$ git bisect bad ; git describe --long
Bisecting: 78 revisions left to test after this (roughly 6 steps)
[c3d35f3acc3a11b726959c7b2c25ab9e46310273] USB: serial: option: add Telit LE910Cx composition 0x1204
Ubuntu-5.13.0-22.22-396-gc3d35f3acc3a

I’m not going to describe all the steps in full detail; by now you should get the gist of it. This was the sequence of steps that git bisect gave me:

  • 317: good
  • 475: bad
  • 396: bad
  • 356: good
  • 376: good
  • 386: good
  • 391: bad
  • 388: bad
  • 387: bad

And then we finally get the first bad commit, the 387th commit after 5.13.0-22, Ubuntu-5.13.0-22.22-387-g0fc979747dec:

$ git bisect bad ; git describe --long
0fc979747dece96c189bc29ef604e61afbddfa2a is the first bad commit
commit 0fc979747dece96c189bc29ef604e61afbddfa2a
Author: Pavankumar Kondeti <pkondeti@codeaurora.org>
Date:   Fri Oct 8 12:25:46 2021 +0300

    xhci: Fix command ring pointer corruption while aborting a command
    
    BugLink: https://bugs.launchpad.net/bugs/1951880
    
    commit ff0e50d3564f33b7f4b35cadeabd951d66cfc570 upstream.
    
    The command ring pointer is located at [6:63] bits of the command
    ring control register (CRCR). All the control bits like command stop,
    abort are located at [0:3] bits. While aborting a command, we read the
    CRCR and set the abort bit and write to the CRCR. The read will always
    give command ring pointer as all zeros. So we essentially write only
    the control bits. Since we split the 64 bit write into two 32 bit writes,
    there is a possibility of xHC command ring stopped before the upper
    dword (all zeros) is written. If that happens, xHC updates the upper
    dword of its internal command ring pointer with all zeros. Next time,
    when the command ring is restarted, we see xHC memory access failures.
    Fix this issue by only writing to the lower dword of CRCR where all
    control bits are located.
    
    Cc: stable@vger.kernel.org
    Signed-off-by: Pavankumar Kondeti <pkondeti@codeaurora.org>
    Signed-off-by: Mathias Nyman <mathias.nyman@linux.intel.com>
    Link: https://lore.kernel.org/r/20211008092547.3996295-5-mathias.nyman@linux.intel.com
    Signed-off-by: Greg Kroah-Hartman <gregkh@linuxfoundation.org>
    Signed-off-by: Kamal Mostafa <kamal@canonical.com>
    Signed-off-by: Stefan Bader <stefan.bader@canonical.com>

 drivers/usb/host/xhci-ring.c | 14 ++++++++++----
 1 file changed, 10 insertions(+), 4 deletions(-)
Ubuntu-5.13.0-22.22-387-g0fc979747dec

At first sight the commit description is quite cryptic, and the actual code change doesn’t tell me a lot either. But it’s a change in drivers/usb/host/xhci-ring.c, and xHCI stands for eXtensible Host Controller Interface, an interface specification for USB host controllers. If it’s an issue with the USB host controller, then it makes sense that neither of my two keyboards from different brands would work. It also suggests that other USB devices, like external hard drives, wouldn’t work either, but that’s a bit harder to test. A keyboard is easy: just look at the NumLock LED; if it doesn’t light up, there’s an issue.

The first link in the commit description is just a long list of patches that were taken from upstream and integrated into the Ubuntu kernel, so that doesn’t help me. The second link is a thread on the kernel.org mailing list, and that’s where it gets interesting.


Some excerpts from the thread:

This patch cause suspend to disk resume usb not work, xhci_hcd 0000:00:14.0: Abort failed to stop command ring: -110.

youling257

Thanks for the report, this is odd.
Could you double check that by reverting this patch resume start working again.
If this is the case maybe we need to write all 64bits before this xHC hardware reacts to CRCR register changes.
Maybe following changes on top of current patch could help:

Mathias Nyman

Every time a developer says “this is odd”, my alarm bells go off. 😀

Further down in the thread there is a proposed update to the change. I’m going to try that patch, but that’s for another blog post.


The hunt for a kernel bug, part 3: compiling a kernel

Compiling a Linux kernel sounds scary and complicated, but I found out it actually isn’t.

The first thing to do is to install some prerequisites:

$ sudo apt install --yes asciidoc binutils-dev bison build-essential ccache \
    crash dwarves fakeroot flex git git-core git-doc git-email kernel-package \
    kernel-wedge kexec-tools libelf-dev libncurses5 libncurses5-dev libssl-dev \
    makedumpfile zstd
$ sudo apt-get --yes build-dep linux

Next I cloned the Ubuntu Impish repository. This takes a while…

$ git clone git://kernel.ubuntu.com/ubuntu/ubuntu-impish.git
$ cd ubuntu-impish

Now let’s see which versions are in the repository:

$ git tag --list
Ubuntu-5.11.0-16.17
Ubuntu-5.11.0-18.19+21.10.1
Ubuntu-5.11.0-20.21+21.10.1
Ubuntu-5.13.0-11.11
Ubuntu-5.13.0-12.12
Ubuntu-5.13.0-13.13
Ubuntu-5.13.0-14.14
Ubuntu-5.13.0-15.15
Ubuntu-5.13.0-16.16
Ubuntu-5.13.0-17.17
Ubuntu-5.13.0-18.18
Ubuntu-5.13.0-19.19
Ubuntu-5.13.0-20.20
Ubuntu-5.13.0-21.21
Ubuntu-5.13.0-22.22
Ubuntu-5.13.0-23.23
Ubuntu-5.13.0-24.24
Ubuntu-5.13.0-25.26
Ubuntu-5.13.0-26.27
Ubuntu-5.13.0-27.29
Ubuntu-5.13.0-28.31
Ubuntu-5.13.0-29.32
Ubuntu-5.13.0-30.33
Ubuntu-5.13.0-31.34
Ubuntu-5.13.0-32.35
freeze-20211018
freeze-20211108
freeze-20220131
freeze-20220221
v5.11
v5.13

The two tags that interest me are Ubuntu-5.13.0-22.22 and Ubuntu-5.13.0-23.23. I’m starting with the former.

$ git checkout Ubuntu-5.13.0-22.22

First I copy the configuration of the currently running kernel to the working directory:

$ cp /boot/config-$(uname --kernel-release) .config

I don’t want or need full debugging. That makes an enormous kernel and it takes twice as long to compile, so I turn debugging off:

$ scripts/config --disable DEBUG_INFO

I need to disable certificate stuff:

$ scripts/config --disable SYSTEM_TRUSTED_KEYS
$ scripts/config --disable SYSTEM_REVOCATION_KEYS

Next: update the kernel config and set all new symbols to their default value.

$ make olddefconfig

Then the most exciting thing can start: actually compiling the kernel!

$ make clean
$ time make --jobs=$(getconf _NPROCESSORS_ONLN) bindeb-pkg \
    LOCALVERSION=-$(git describe --long | tr '[:upper:]' '[:lower:]')
  • time is to see how long the compilation took.
  • getconf _NPROCESSORS_ONLN queries the number of processors on the computer. make will then try to run that many jobs in parallel.
  • bindeb-pkg will create .deb packages in the directory above.
  • LOCALVERSION appends a string to the kernel name.
  • git describe --long shows how far after a tag a certain commit is. In this case: Ubuntu-5.13.0-22.22-0-g3ab15e228151
    • Ubuntu-5.13.0-22.22 is the tag.
    • 0 is how many commits after the tag. In this case it’s the tag itself.
    • 3ab15e228151 is the abbreviated hash of the current commit.
  • tr '[:upper:]' '[:lower:]' is needed because .deb packages can’t contain upper case letters (I found out the hard way).

Now go grab a coffee, tea or chai latte. Compilation took 22 minutes on my computer.


When the compilation is done, there are 3 .deb packages in the directory above:

$ ls -1 ../*.deb
../linux-headers-5.13.19-ubuntu-5.13.0-22.22-0-g3ab15e228151_5.13.19-ubuntu-5.13.0-22.22-0-g3ab15e228151-21_amd64.deb
../linux-image-5.13.19-ubuntu-5.13.0-22.22-0-g3ab15e228151_5.13.19-ubuntu-5.13.0-22.22-0-g3ab15e228151-21_amd64.deb
../linux-libc-dev_5.13.19-ubuntu-5.13.0-22.22-0-g3ab15e228151-21_amd64.deb

Install the linux-headers and the linux-image packages, you don’t need the libc-dev package.

$ sudo dpkg --install \
    ../linux-{headers,image}-*$(git describe --long | tr '[:upper:]' '[:lower:]')*.deb

The kernel is now installed in the /boot directory, and it’s available in the Grub menu after reboot.

$ ls -1 /boot/*$(git describe --long | tr '[:upper:]' '[:lower:]')*
/boot/config-5.13.19-ubuntu-5.13.0-22.22-0-g3ab15e228151
/boot/initrd.img-5.13.19-ubuntu-5.13.0-22.22-0-g3ab15e228151
/boot/System.map-5.13.19-ubuntu-5.13.0-22.22-0-g3ab15e228151
/boot/vmlinuz-5.13.19-ubuntu-5.13.0-22.22-0-g3ab15e228151

Kernel ubuntu-5.13.0-22.22-0-g3ab15e228151 is, for all intents and purposes, the same as kernel 5.13.0-22-generic, so I expected it to be a “good” kernel, and it was.

For kernel Ubuntu-5.13.0-23.23 I did the same thing: starting from the git checkout. I skipped copying and editing the config file, because between minor releases I don’t expect there to be much change. I did run make olddefconfig for good measure, though. As expected, the keyboard and mouse didn’t work with the compiled ...-23 kernel.

Next up: using git bisect to find the exact commit where it went wrong. It’s got to be somewhere between ...-22 and ...-23!

The hunt for a kernel bug, part 2: an easy way to install mainline kernels

As I wrote previously, I’m suspecting a Linux kernel bug somewhere between versions 5.13.0-22 and 5.13.0-23, in the Ubuntu kernels. I wanted to know if the issue only surfaced in Ubuntu-flavored kernels, or also in the upstream (mainline) kernels from kernel.org.

There is an Ubuntu Mainline PPA with all the upstream kernels, but I found it a bit too opaque to use. Fortunately I found the Ubuntu Mainline Kernel Installer (UMKI), a tool for installing the latest Linux kernels on Ubuntu-based distributions.


The UMKI is pretty straightforward. It fetches a list of kernels from the Ubuntu Mainline PPA, and its GUI displays available and installed kernels, regardless of how they were installed. It installs the kernel, headers and modules. There is also a CLI client.

To install the UMKI:

$ sudo add-apt-repository ppa:cappelikan/ppa
$ sudo apt update
$ sudo apt install mainline

With that out of the way, there’s the matter of deciding which kernels to try. The “interesting” Ubuntu kernels are 5.13.0-22 and 5.13.0-23, so the mainline kernels I definitely want to test are the ones around those versions: 5.13.0 and 5.13.1. I also want to try the latest 5.13.x kernel, which is 5.13.19, and the most recent stable kernel, 5.16.11 (as of 2022-03-01).

To summarize, I have tested these mainline kernels:

  • 5.13.0
  • 5.13.1
  • 5.13.19
  • 5.16.11

The result (after several reboots)? With all of them, my keyboard and mouse worked without a hitch. That means the issue most likely doesn’t occur in (stable) mainline kernels, only in kernels with additional patches from Ubuntu.

Up next: compiling kernels from source.

Lasciate ogne speranza, voi ch’intrate. (“Abandon all hope, ye who enter.”)

Dante Alighieri


The hunt for a kernel bug, part 1

The operating system on my computer is Ubuntu Linux, version 21.10 (Impish Indri). Recently I ran into an issue where, after a kernel update (and reboot), my USB keyboard and mouse didn’t work anymore at the login screen. Huh, that’s unexpected.
The issue was:

  • At the Grub boot menu, the keyboard works: I can use the keys, the NumLock LED lights up, and the LCD of the Logitech G19 displays a logo.
  • At the Ubuntu login screen, the keyboard (and the mouse) go dark: no backlight on the keys, no NumLock LED, no logo on the display. And the mouse cursor doesn’t move on screen.

Must be a problem on my end, I initially thought, because surely something as essential as input devices wouldn’t be broken by a simple kernel update? So I did some basic troubleshooting:

  • Have you tried to turn it off and on again?
  • Plug the keyboard in another USB port.
  • Try a different keyboard.
  • Boot the older kernel, which was still in the Grub menu. And indeed, this gave me back control over my input devices!

So if the only thing I changed was the kernel, then maybe it’s a kernel bug after all?

I know that Ubuntu 21.10 uses kernel 5.something, and I know that I use the generic kernels. So which kernels are we talking about, actually?

$ apt-cache show linux-image-5*-generic | grep Package: | sed 's/Package: //g'
linux-image-5.13.0-19-generic
linux-image-5.13.0-20-generic
linux-image-5.13.0-21-generic
linux-image-5.13.0-22-generic
linux-image-5.13.0-23-generic
linux-image-5.13.0-25-generic
linux-image-5.13.0-27-generic
linux-image-5.13.0-28-generic
linux-image-5.13.0-30-generic

9 kernels, that’s not too bad. All of them 5.13.0-XX-generic. So I just installed all the kernels:

$ sudo apt install --yes \
    linux-{image,headers,modules,modules-extra,tools}-5.13.0-*-generic
One Eternity Later

My /boot directory is quite busy now:

$  ls -hl /boot
total 1,2G
drwxr-xr-x  4 root root  12K mrt  1 18:11 .
drwxr-xr-x 20 root root 4,0K mrt  1 18:11 ..
-rw-r--r--  1 root root 252K okt  7 11:09 config-5.13.0-19-generic
-rw-r--r--  1 root root 252K okt 15 15:53 config-5.13.0-20-generic
-rw-r--r--  1 root root 252K okt 19 10:41 config-5.13.0-21-generic
-rw-r--r--  1 root root 252K nov  5 10:21 config-5.13.0-22-generic
-rw-r--r--  1 root root 252K nov 26 12:14 config-5.13.0-23-generic
-rw-r--r--  1 root root 252K jan  7 16:16 config-5.13.0-25-generic
-rw-r--r--  1 root root 252K jan 12 15:43 config-5.13.0-27-generic
-rw-r--r--  1 root root 252K jan 13 18:13 config-5.13.0-28-generic
-rw-r--r--  1 root root 252K feb  4 17:40 config-5.13.0-30-generic
drwx------  4 root root 4,0K jan  1  1970 efi
drwxr-xr-x  5 root root 4,0K mrt  1 18:11 grub
lrwxrwxrwx  1 root root   28 feb 28 04:26 initrd.img -> initrd.img-5.13.0-22-generic
-rw-r--r--  1 root root  40M mrt  1 16:02 initrd.img-5.13.0-19-generic
-rw-r--r--  1 root root  40M mrt  1 17:39 initrd.img-5.13.0-20-generic
-rw-r--r--  1 root root  40M mrt  1 17:38 initrd.img-5.13.0-21-generic
-rw-r--r--  1 root root  40M feb 26 13:55 initrd.img-5.13.0-22-generic
-rw-r--r--  1 root root  40M mrt  1 17:40 initrd.img-5.13.0-23-generic
-rw-r--r--  1 root root  40M mrt  1 17:40 initrd.img-5.13.0-25-generic
-rw-r--r--  1 root root  40M mrt  1 17:41 initrd.img-5.13.0-27-generic
-rw-r--r--  1 root root  40M mrt  1 17:41 initrd.img-5.13.0-28-generic
-rw-r--r--  1 root root  40M mrt  1 17:38 initrd.img-5.13.0-30-generic
-rw-------  1 root root 5,7M okt  7 11:09 System.map-5.13.0-19-generic
-rw-------  1 root root 5,7M okt 15 15:53 System.map-5.13.0-20-generic
-rw-------  1 root root 5,7M okt 19 10:41 System.map-5.13.0-21-generic
-rw-------  1 root root 5,7M nov  5 10:21 System.map-5.13.0-22-generic
-rw-------  1 root root 5,7M nov 26 12:14 System.map-5.13.0-23-generic
-rw-------  1 root root 5,7M jan  7 16:16 System.map-5.13.0-25-generic
-rw-------  1 root root 5,7M jan 12 15:43 System.map-5.13.0-27-generic
-rw-------  1 root root 5,7M jan 13 18:13 System.map-5.13.0-28-generic
-rw-------  1 root root 5,7M feb  4 17:40 System.map-5.13.0-30-generic
lrwxrwxrwx  1 root root   25 feb 28 04:27 vmlinuz -> vmlinuz-5.13.0-22-generic
-rw-------  1 root root 9,8M okt  7 19:37 vmlinuz-5.13.0-19-generic
-rw-------  1 root root 9,8M okt 15 15:56 vmlinuz-5.13.0-20-generic
-rw-------  1 root root 9,8M okt 19 10:43 vmlinuz-5.13.0-21-generic
-rw-------  1 root root 9,8M nov  5 13:51 vmlinuz-5.13.0-22-generic
-rw-------  1 root root 9,8M nov 26 11:52 vmlinuz-5.13.0-23-generic
-rw-------  1 root root 9,8M jan  7 16:19 vmlinuz-5.13.0-25-generic
-rw-------  1 root root 9,8M jan 12 16:19 vmlinuz-5.13.0-27-generic
-rw-------  1 root root 9,8M jan 13 18:10 vmlinuz-5.13.0-28-generic
-rw-------  1 root root 9,8M feb  4 17:46 vmlinuz-5.13.0-30-generic

I tried all these kernels. The last kernel where my input devices still worked, was 5.13.0-22-generic, and the first where they stopped working, was 5.13.0-23-generic. Which leads me to assume that some unintended change was introduced between those two versions, and it hasn’t been fixed since.

For now, I’m telling Ubuntu to keep kernel 5.13.0-22-generic and not upgrade to a more recent version.

$ sudo apt-mark hold linux-image-5.13.0-22-generic
linux-image-5.13.0-22-generic set on hold.

I also want Grub to show me the known working kernel as the default choice. To do that, I’ve put this in /etc/default/grub:

GRUB_DEFAULT="Advanced options for Ubuntu>Ubuntu, with Linux 5.13.0-22-generic"

followed by sudo update-grub.

I’ll do the following things next, to get to the bottom of this:

  • Try the upstream (mainline) kernels, to see if the issue is specific to the Ubuntu kernels.
  • Compile kernels from source.
  • Use git bisect to find the exact commit where it went wrong.


Moving!

A few months ago I wrote about my preferred region to work. Well, that’s no longer true. The co-housing project where I live (in Merelbeke, near Ghent) is going to end, and I need to move by the end of July 2022.

This also has an influence on my preferred place to work. I have decided to find a place to live not too far from work, wherever that may be (because I’m still on the #jobhunt). Ideally it would be inside the triangle Ghent-Antwerp-Brussels but I think I could even be convinced by the Leuven area.

Factors I’ll take into account:

  • Elevation and hydrology – with climate change, I don’t want to live somewhere with increased risk of flooding.
  • Proximity of essential shops and services.
  • Proximity of public transport with a good service.
  • Proximity of car sharing services like Cambio.
  • Not too far from something green (a park will do just fine) to go for a walk or a run.

I haven’t started looking yet, and I’m not even sure whether I want to do co-housing again or live on my own. That’ll depend on the price, I guess. (Living alone? In this economy???) First I want to land a job.

That makes sense—without knowing where I will be working, house hunting feels a bit like putting the cart before the horse. Still, I find myself browsing listings occasionally, more out of curiosity than anything else. It is interesting to see how prices and availability vary wildly, even within the triangle I mentioned. Some towns look charming on paper but lack the basics I need; others tick all the boxes but come with a rental price that makes my eyebrows do gymnastics.

In the meantime, I am mentally preparing for a lot of change. Leaving my current co-housing situation is bittersweet. On one hand, it has been a wonderful experience: shared dinners, spontaneous conversations, and a real sense of community. On the other hand, living with others also means compromise, and part of me wonders what it would be like to have a space entirely to myself. No shared fridges, no waiting for the bathroom, and the joy of decorating a place to my own taste.

That said, co-housing still appeals to me. If I stumble upon a like-minded group or an interesting project in a new city, I would definitely consider it. The key will be finding something that balances affordability, autonomy, and connection. I do not need a commune, but I also do not want to feel isolated.

I suppose this transition is about more than just logistics—it is also a moment to rethink what I want day-to-day life to look like. Am I willing to commute a bit longer for a greener environment? Would I trade square meters for access to culture and nightlife? Do I want to wake up to birdsong or the rumble of trams?

These are the questions swirling around my head as I polish up my CV, send out job applications, and daydream about future homes. It is a lot to juggle, but oddly enough, I feel optimistic. This is a chance to design a new chapter from scratch. A little daunting, sure. But also full of possibility.


How I organize my message flow

Email

I use 2 email clients at the same time: Thunderbird and Gmail.

  • Thunderbird: runs on my local system, it’s very fast, it shows me all the metadata of an email in the way I want, the email list is not paged, and I can use it for high-volume actions on email. These happen on my local system, and then the IMAP protocol gradually syncs them to Gmail. I also find that Thunderbird’s email notifications integrate more nicely with Ubuntu.
  • Gmail: can’t be beaten for search. It also groups mail conversations. And then there are labels!

Gmail has several tabs: Primary, Social, Promotions, Updates and Forums. Gmail is usually smart enough that it can classify most emails in the correct tab. If it doesn’t: drag the email to the correct tab, and Gmail will ask you if all future emails of that sender should go to the same tab. This system works well enough for me. My email routine is to first check the tabs Social, Promotions and Forums, and delete or unsubscribe from most emails that end up there. All emails about the #jobhunt go to Updates. I clean up the other emails in that tab (delete, unsubscribe, filter, archive) so that only the #jobhunt emails remain. Those I give a label – more about that later. Then I go to the Inbox. Any emails there (shouldn’t be many) are also taken care of: delete, unsubscribe, filter, archive or reply.


Gmail has 3 Send options: regular Send, Schedule send (which I don’t use) and Send + Archive. The last one is probably my favorite button. When I reply to an email, it is in most cases a final action on that item, so after the email is sent, it’s dealt with, and I don’t need to see it in my Inbox any more. And if there is a reply on the email, then the entire conversation will just go to the Inbox again (unarchived).


I love labels! At the level of an individual email, you can add several labels. The tabs are also labels, so if you add the label Inbox to an archived email, then it will be shown in the Inbox again. At the level of the entire mailbox, labels behave a bit like mail folders. You can even have labels within labels, in a directory structure. Contrary to traditional mail clients, where an email could only be in one mail folder, you can add as many labels as you want.
The labels are also shown as folders in an IMAP mail client like Thunderbird. If you move an email from one folder to another, then the corresponding label gets updated in Gmail.
The labels that I use in my #jobhunt are work/jobhunt, work/jobhunt/call_back, work/jobhunt/not_interesting, work/jobhunt/not_interesting/freelance, work/jobhunt/not_interesting/abroad, work/jobsites and work/coaching. The emails that end up with the abroad label are source material for my blog post Working Abroad?

The label list on the left looks like a directory structure. It’s actually a mix of labels and traditional folders like Sent, Drafts, Spam, Trash,… Those are always visible at the top. Then there is a neat little trick for labels. If you have a lot of labels, like me, then Gmail will hide some of them behind a “More” button. You can influence which labels are always visible by selecting Show if unread on that label. This only applies to top-level labels. When there are no unread emails with that label or any of its sublabels, the label will be hidden below the More button. As soon as there are unread mails with that label or any of its sublabels, the label will be visible. Mark all mails as read, and the label is out of view. Again, less clutter: you only see it when you need it.


Filters, filters, filters. I think I have a gazillion filters. (208, actually – I exported them to XML so I could count them) Each time I have more than two emails that have something meaningful in common, I make a filter. Most of my filters have the setting ‘Skip Inbox’. They will remain unread in the label where I put them, and I’ll read them when it’s convenient for me. For example, emails that are automatically labelled takeaway aren’t important and don’t need to be in the Inbox, but when I want to order takeaway, I take a look in that folder to see if there are any promo codes.

Email templates. Write a draft email, click on the 3 dots bottom right, save draft as template. Now I can reuse the same text so that I don’t have to write for the umpteenth time that I don’t do freelance. I could send an autoreply with templates, but for now I’ll still do it manually.

LinkedIn

I can be short about that: it’s a mess. You can only access LinkedIn messages from the website, and if you have a lot of messages, then it behaves like a garbage pile. Some people also have an expectation that it’s some sort of instant messaging. For me it definitely isn’t. And just like with email: I archive LinkedIn chats as soon as I have replied.

I used to have an autoreply that told people to email me, and gave a link to my CV and my blog. What do you think, should I enable that again?


Reduce unit tests boilerplate with Jest’s .each syntax

When writing unit tests, especially in JavaScript/TypeScript with Jest, you often run into a common problem: repetition.
Imagine testing a function with several input-output pairs. The tests can become bloated and harder to read.
This is where Jest’s .each syntax shines. It lets you write cleaner, data-driven tests with minimal duplication.

The Problem: Repetitive Test Cases

Take a simple sum function:

function sum(a, b) {
  return a + b;
}

Without .each, you might write your tests like this:

test('adds 1 + 2 to equal 3', () => {
  expect(sum(1, 2)).toBe(3);
});

test('adds 2 + 3 to equal 5', () => {
  expect(sum(2, 3)).toBe(5);
});

test('adds -1 + -1 to equal -2', () => {
  expect(sum(-1, -1)).toBe(-2);
});

These tests work, but they are verbose. You repeat the same logic over and over with only the inputs and expected results changing.

The Solution: Jest’s .each Syntax

Jest’s .each allows you to define test cases as data and reuse the same test body.
Here is the same example using .each:

describe('sum', () => {
  test.each([
    [1, 2, 3],
    [2, 3, 5],
    [-1, -1, -2],
  ])('%i + %i returns %i', (a, b, expected) => {
    expect(sum(a, b)).toBe(expected);
  });
});

This single block of code replaces three separate test cases.
Each array in the .each list corresponds to a test run, and Jest automatically substitutes the values.

Bonus: Named Arguments with Tagged Template Literals

You can also use named arguments for clarity:

test.each`
  a    | b    | expected
  ${1} | ${2} | ${3}
  ${2} | ${3} | ${5}
  ${-1} | ${-1} | ${-2}
`('returns $expected when $a + $b', ({ a, b, expected }) => {
  expect(sum(a, b)).toBe(expected);
});

This syntax is more readable, especially when dealing with longer or more descriptive variable names.
It reads like a mini table of test cases.

Why Use .each?

  • Less boilerplate: Define the test once and reuse it.
  • Better readability: Data-driven tests are easier to scan.
  • Easier maintenance: Add or remove cases without duplicating test logic.
  • Fewer mistakes: Repeating the same code invites copy-paste errors.

Use Case: Validating Multiple Inputs

Suppose you are testing a validation function like isEmail. You can define all test cases in one place:

test.each([
  ['user@example.com', true],
  ['not-an-email', false],
  ['hello@world.io', true],
  ['@missing.local', false],
])('validates %s as %s', (input, expected) => {
  expect(isEmail(input)).toBe(expected);
});

This approach scales better than writing individual test blocks for every email address.
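
For completeness, here is one possible isEmail that satisfies the cases above. It is only a naive sketch for illustration, not a robust email validator:

// Naive check: something before the @, something after it, and a dot in the domain.
// Good enough for the test data above; real-world email validation is far messier.
function isEmail(input: string): boolean {
  return /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(input);
}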

Conclusion

Jest’s .each is a powerful way to reduce duplication in your test suite.
It helps you write cleaner, more maintainable, and more expressive tests.
Next time you find yourself writing nearly identical test cases, reach for .each—your future self will thank you.

My take on the Gilded Rose kata

The Gilded Rose Kata by Emily Bache is a staple in refactoring exercises. It offers a deceptively simple problem: refactor an existing codebase while preserving its behavior. I recently worked through the TypeScript version of the kata, and this post documents the transformation from a legacy mess into clean, testable code—with examples along the way.

But before diving into the code, I should mention: this was my very first encounter with TypeScript. I had never written a single line in the language before this exercise. That added an extra layer of learning—on top of refactoring legacy code, I was also picking up TypeScript’s type system, syntax, and tooling from scratch.


🧪 Development Workflow

Pre-Commit Hooks

pre-commit.com is a framework for managing and maintaining multi-language pre-commit hooks. It allows you to define a set of checks (such as code formatting, linting, or security scans) that automatically run before every commit, helping ensure code quality and consistency across a team. Hooks are easily configured in a .pre-commit-config.yaml file and can be reused from popular repositories or custom scripts. It integrates seamlessly with Git and supports many languages and tools out of the box.

I added eslint and gitlint:

- repo: https://github.com/pre-commit/mirrors-eslint
  hooks:
    - id: eslint

- repo: https://github.com/jorisroovers/gitlint
  hooks:
    - id: gitlint

GitHub Actions

GitHub Actions was used to automate the testing workflow, ensuring that every push runs the full test suite. This provides immediate feedback when changes break functionality, which was especially important while refactoring the legacy Gilded Rose code. The setup installs Yarn with npm, then uses Yarn to install dependencies, compile and run the tests, ensuring consistent results across different environments and giving confidence to refactor freely while learning TypeScript.

name: Build

on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]

jobs:
  build:
    runs-on: ubuntu-latest

    strategy:
      matrix:
        node-version: [12.x]

    steps:
      - uses: actions/checkout@v2
      - name: Node.js
        uses: actions/setup-node@v1
        with:
          node-version: ${{ matrix.node-version }}
      - run: npm install -g yarn
        working-directory: ./TypeScript
      - name: yarn install, compile and test
        run: |
          yarn
          yarn compile
          yarn test
        working-directory: ./TypeScript

🔍 Starting Point: Legacy Logic

Originally, everything was handled in a massive updateQuality() function using nested if statements like this:

if (item.name !== 'Aged Brie' && item.name !== 'Backstage passes') {
    if (item.quality > 0) {
        item.quality--;
    }
} else {
    if (item.quality < 50) {
        item.quality++;
    }
}

The function mixed different concerns and was painful to extend.


🧪 Building Safety Nets

Golden master tests are a technique used to protect legacy code during refactoring by capturing the current behavior of the system and comparing it against future runs. In this project, I recorded the output of the original updateQuality() function across many item variations. As changes were made to clean up and restructure the logic, the tests ensured that the external behavior remained identical. This approach was especially useful when the codebase was poorly understood or untested, offering a reliable safety net while improving internal structure.

expect(goldenMasterOutput).toEqual(currentOutput);
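
A minimal sketch of what such a golden master test could look like, using Jest’s snapshot support to do the recording. GildedRose and Item come from the kata; the import path and the item variations here are assumptions for illustration:

import { GildedRose, Item } from '../app/gilded-rose'; // path assumed

describe('golden master', () => {
  test('updateQuality matches the recorded behavior', () => {
    // Generate a wide range of item variations to exercise the legacy logic.
    const names = ['Aged Brie', 'Sulfuras', 'Backstage passes', 'Conjured Mana Cake', 'plain sword'];
    const items: Item[] = [];
    for (const name of names) {
      for (const sellIn of [-1, 0, 1, 5, 10, 11]) {
        for (const quality of [0, 1, 25, 49, 50]) {
          items.push(new Item(name, sellIn, quality));
        }
      }
    }

    const shop = new GildedRose(items);
    shop.updateQuality();

    // The first run records the output; later runs fail if the behavior changes.
    expect(shop.items).toMatchSnapshot();
  });
});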

🧹 Refactoring: Toward Structure and Simplicity

1. Extracting Logic

I moved logic to a separate method:

private doUpdateQuality(item: Item) {
    // clean, focused logic
}

This isolated the business rules from boilerplate iteration.

2. Replacing Conditionals with a switch

Using a switch statement instead of multiple if/else if blocks makes the code cleaner, more readable, and easier to maintain—especially when checking a single variable (like item.name) against several known values. It clearly separates each case, making it easier to scan and reason about the logic. In the Gilded Rose project, switching to switch also made it easier to later refactor into specialized handlers or classes for each item type, as each case represented a clear and distinct behavior to isolate.

switch (item.name) {
    case 'Aged Brie':
        this.updateBrie(item);
        break;
    case 'Sulfuras':
        break; // no-op
    case 'Backstage passes':
        this.updateBackstage(item);
        break;
    default:
        this.updateNormal(item);
}

This increased clarity and prepared the ground for polymorphism or factory patterns later.


🛠 Polishing the Code

Constants and Math Utilities

Instead of magic strings and numbers, I introduced constants:

const MAX_QUALITY = 50;
const MIN_QUALITY = 0;

I replaced verbose checks with:

item.quality = Math.min(MAX_QUALITY, item.quality + 1);
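
And presumably the mirror image for the lower bound, which is what MIN_QUALITY is for:

item.quality = Math.max(MIN_QUALITY, item.quality - 1);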

Factory Pattern

The factory pattern is a design pattern that creates objects without exposing the exact class or construction logic to the code that uses them. Instead of instantiating classes directly with new, a factory function or class decides which subclass to return based on input—like item names in the Gilded Rose kata. This makes it easy to add new behaviors (e.g., “Conjured” items) without changing existing logic, supporting the Open/Closed Principle and keeping the code modular and easier to test or extend.

switch (true) {
    case /^Conjured/.test(item.name):
        return new ConjuredItem(item);
    case item.name === 'Sulfuras':
        return new SulfurasItem(item);
    // ...
}

🌟 Feature Additions

With structure in place, adding Conjured Items was straightforward:

class ConjuredItem extends ItemUpdater {
    update() {
        this.decreaseQuality(2);
        this.decreaseSellIn();
    }
}

A corresponding test was added to confirm behavior.
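
The ConjuredItem class above relies on helpers from its parent class, which the post does not show. As a rough sketch of what such an ItemUpdater base class could look like (the names are taken from the snippets above; the exact implementation is assumed):

const MAX_QUALITY = 50;
const MIN_QUALITY = 0;

class Item {
  constructor(public name: string, public sellIn: number, public quality: number) {}
}

class ItemUpdater {
  constructor(protected item: Item) {}

  // Default behavior for a normal item: quality and sellIn both go down by one.
  update() {
    this.decreaseQuality(1);
    this.decreaseSellIn();
  }

  protected decreaseQuality(amount: number) {
    // Quality never drops below the lower bound.
    this.item.quality = Math.max(MIN_QUALITY, this.item.quality - amount);
  }

  protected increaseQuality(amount: number) {
    // Quality never rises above the upper bound.
    this.item.quality = Math.min(MAX_QUALITY, this.item.quality + amount);
  }

  protected decreaseSellIn() {
    this.item.sellIn--;
  }
}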


🎯 Conclusion

The journey from legacy to clean architecture was iterative and rewarding. Key takeaways:

  • Set up CI and hooks early to enforce consistency.
  • Use golden master tests for safety.
  • Start small with extractions and switch statements.
  • Add structure gradually—factories, constants, classes.
  • With a clean base, adding features like “Conjured” is trivial.

All this while learning TypeScript for the first time!

You can explore the full codebase and history here:
📦 Gilded Rose Refactoring Kata — TypeScript branch

Curious to try it yourself, also in other languages?
Fork Emily Bache’s repo here: GildedRose-Refactoring-Kata on GitHub


A small rant about dependencies (and a promise)

Every now and then I run into some awesome open source project on GitHub, that is written in some cool programming language, and it assumes that the development tools for that language are already installed. My assumption is that they have a specific target audience in mind: an already existing developer community around that specific language. People who already have those tools installed.

The annoying thing is when someone like me, who doesn’t really need to know if a thing is written in Python or Ruby or JavaScript or whatever, tries to follow instructions like these:

$ pip install foo
Command 'pip' not found
$ gem install bar
Command 'gem' not found
$ yarn install baz
Command 'yarn' not found
$ ./configure && make && sudo make install
Command 'make' not found

By now, I already know that I first need to do sudo apt install python3-pip (or the equivalent installation commands for RubyGems, Yarn, build-essential,…). I also understand that, within the context of a specific developer community, this is so obvious that it is often assumed. That being said, I am making a promise:

For every open source project that I will henceforth publish online (on Github or any other code sharing platforms), I promise to do the following things:
(1) Test the installation on at least one clean installed operating system – which will be documented.
(2) Include full installation steps in the documentation, including all frameworks, development tools, etc. that would otherwise be assumed.
(3) Where possible and useful, provide an installation script.

The operating system I’m currently targeting is Ubuntu, which means I’ll include apt commands. I’m counting on Continuous Integration to help me test on other operating systems that I don’t personally use.

I’m learning to play the nyckelharpa

In 2016 I did something unexpected: I picked up a nyckelharpa for the very first time.

I had never played an instrument “for real” before. Sure, I played the recorder in school, but I was terrible at it and hated every minute. So when I started learning nyckelharpa, it was a fresh beginning, a clean slate.

Why the nyckelharpa?

One of the biggest reasons I got interested in the nyckelharpa is because I love to dance – especially balfolk, and even more so the Swedish polska. So it all started with the dancing. I listened to a lot of polska music, and soon I noticed that many of my favorite tunes were played on the nyckelharpa. Before I knew it, I wanted to try playing them myself.

What is a nyckelharpa?

A nyckelharpa is a traditional Swedish keyed fiddle. It has strings that you play with a bow, and instead of pressing the strings directly, you use wooden keys that stop the string at the correct pitch. This gives it a very special sound: warm, resonant and almost magical. I fell in love with the sound right away.

My first steps

I started taking lessons at the music school in Schoten, Belgium, where my teacher is Ann Heynen. Since then I have taken part in many weekend courses and workshops in Belgium, Germany, the Netherlands and the United Kingdom.
(I haven’t been to Sweden for a course yet! But it is on my wish list.)

That is where I got to learn from some of the most inspiring folk musicians I have ever met:
Jule Bauer, Magnus Holmström, Emilia Amper, Marco Ambrosini, Didier François, Josefina Paulson, Vicki Swan, David Eriksson, Olena Yeremenko, Björn Kaidel, Olov Johansson, Eléonore Billy, Johannes Mayr, Johan Lång, Alban Faust, Koen Vanmeerbeek, Eveline d’Hanens, and surely many more fantastic musicians I am forgetting right now.

During the courses I have also made many new acquaintances, and even some real friends, from all over Europe.
We share the same passion for music, dancing, and culture, and it is amazing how the nyckelharpa can bring people together across borders.

From a rental to my own nyckelharpa

Like many beginners, I started by renting an instrument. But in 2019 I felt it was time for the next step, and I ordered my own nyckelharpa from Jean-Claude Condi, a luthier in Mirecourt, France, a historic center for instrument making.

Unfortunately the pandemic hit shortly afterwards, and it took until August 2021 before I could travel to Mirecourt and finally pick up my nyckelharpa. It was worth the wait.

A journey in both music and language

Learning to play the nyckelharpa also sparked my interest in Swedish culture. I kept hearing Swedish in the songs, and in 2020, I finally decided to start learning the language.
I started studying Swedish at evening school during the school year, and during the holidays I kept practicing with Duolingo. Since then I have tried to combine my two passions: the language and the music.

I often listen to Swedish songs, play traditional songs and folk tunes, and sometimes I try to sing along. It is not only a way to practice, it is also incredibly rewarding.

Playing for dancing

One of my goals is to be able to play well enough that others can dance to my music – just like I love dancing to other people’s tunes.
It is not easy, because by the time I have learned a tune by heart, I have already forgotten how the previous one went… But I keep practicing. One day!

What’s next?

My goal is to one day play together with others at a real spelmansstämma (folk musicians’ gathering) in Sweden, and maybe finally take a course on location in Sweden as well.
But until then I will keep practicing, keep learning, and keep enjoying every note.

I am learning to play the nyckelharpa. And I am learning Swedish. Two passions, one heart. ❤️


🎶 Do you want to start too?

Are you curious about the nyckelharpa? Or maybe you dance balfolk and would like to play yourself?
Do not wait as long as I did — rent an instrument, find a workshop, or try your first tune today.
And if you are already playing: do get in touch! Let us jam, dance, or just talk nyckelharpa.