Ubuntu mainline kernel packages 5.15.7 and later bump a dependency from libssl1.1 (>= 1.1.0) to libssl3 (>= 3.0.0~~alpha1).
However, the libssl3 package is not available for Ubuntu 21.10 Impish Indri. It’s only available for Ubuntu 22.04 Jammy Jellyfish (still in beta at the time of writing) and later.
libssl3 in turn depends on libc6 (>= 2.34) and debconf, but both of those are available in the 21.10 repositories.
Here are a few different ways to resolve the dependency:
Option 1
Use apt pinning to install libssl3 from a Jammy repo, without pulling in everything else from Jammy.
This is more complicated, but it allows the libssl3 package to receive updates automatically. Do all the following as root.
Create an apt config file to specify your system’s current release as the default release for installing packages, instead of simply the highest version number found. We are about to add a Jammy repo to apt, which will contain a lot of packages with higher version numbers, and we want apt to ignore them all.
$ echo 'APT::Default-Release "impish";' \
| sudo tee /etc/apt/apt.conf.d/01ubuntu
Add the Jammy repository to the apt sources. If your system isn’t “impish”, change that below.
$ awk '($1$3$4=="debimpishmain"){$3="jammy" ;print}' /etc/apt/sources.list \
| sudo tee /etc/apt/sources.list.d/jammy.list
Pin libssl3 to the jammy version in apt preferences. This overrides the Default-Release above, just for the libssl3 package.
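A preferences entry along these lines should do it. The priority has to beat the 990 that `APT::Default-Release` assigns to impish packages; treat the exact value as an assumption and verify the result with `apt-cache policy libssl3`:

```
# /etc/apt/preferences.d/libssl3
Package: libssl3
Pin: release n=jammy
Pin-Priority: 991
```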
This only works if there aren’t any additional dependencies, which you would also have to install, with a risk of breaking your system. Here Be Dragons…
Recently I wanted to print some PDF files containing sheet music. The tedious way to do that would be to open them one by one in Evince and press the print button. Surely there must be a more efficient way?
There are 2 console commands for printing: lp and lpr. One comes from grandpa System V, the other from grandpa BSD, and both are included in CUPS. The nice thing about these commands is that they know how to interpret PostScript and PDF files. So this is going to be easy: just cd into the directory with the PDF files and print them all:
$ lp *.pdf
lp: Error - No default destination.
Oops. A quick Google search of this error message tells me that I don’t have a default printer.
$ lpstat -p -d
printer HP_OfficeJet_Pro_9010_NETWORK is idle. enabled since za 12 mrt 2022 00:00:28
printer HP_OfficeJet_Pro_9010_USB is idle. enabled since za 12 mrt 2022 00:00:17
no system default destination
I have an HP OfficeJet Pro 9012e printer, which Ubuntu recognizes as a 9010 series. Close enough. It’s connected over both network and USB. I’m setting the network connection as the default with lpoptions:
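The command is a one-liner; `-d` sets the given queue as the default destination (assuming the queue name reported by lpstat above):

```shell
# Make the network queue the default for lp/lpr;
# lpoptions records this in ~/.cups/lpoptions.
lpoptions -d HP_OfficeJet_Pro_9010_NETWORK
```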
I can then use lpq to verify that the default printer is ready:
$ lpq
HP_OfficeJet_Pro_9010_NETWORK is ready
no entries
Printing multiple files from console
I found that if I naively do lp *.pdf, then only the last file will be printed. That’s unexpected, and I can’t be bothered to find out why. So I just use ls and feed that to a while-loop. It’s quick and dirty, and using find+xargs would probably be better if there are “special” characters, but that’s not the case here.
There’s one caveat: when the PDF files are printed one by one, then the first page will be at the bottom of the paper stack, so I need to print them in reverse order.
$ ls --reverse *.pdf | while read f; do lp "$f"; done
With that command I got 17 print jobs in the printer queue, one for each file.
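For completeness, the find+xargs variant mentioned earlier could look like this. It’s a sketch: NUL separators survive “special” filenames, GNU `sort -rz` reverses the NUL-separated list, and it’s shown as a dry run with `echo` so nothing gets printed by accident; drop the `echo` to actually submit the jobs.

```shell
# Print all PDFs in the current directory in reverse name order,
# one lp job per file, robust against spaces and newlines in names.
# Dry run: remove "echo" to really print.
find . -maxdepth 1 -name '*.pdf' -print0 | sort -rz | xargs -0 -r -n1 echo lp
```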
Now that I know how to print from console, I’ll probably do that more often. The man page of lp describes many useful printing options, like printing double sided:
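For example (`sides` is a standard CUPS option; the filename is a placeholder):

```shell
# Duplex printing, flipping on the long edge of the paper.
lp -o sides=two-sided-long-edge some-document.pdf
```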
Now that I have a way to compile kernels from source, I want to find the exact commit where my input devices stop working. That means doing a git checkout of a certain commit, building the kernel, installing it, rebooting, selecting the new kernel in Grub, and seeing if my keyboard works. I am quite sure that I need to search between 5.13.0-22 and 5.13.0-23, but that’s still 634 commits!
This is where git bisect comes in. It’s sort of a wizard that guides you to find a bad commit. You tell it on which commit your software was known to work ok, and a commit where it doesn’t. It then picks a commit somewhere in the middle, you build your software and do your tests, and then tell git bisect if the result was good or bad. It will then give you a new commit to inspect, each time narrowing the search.
git bisect
Let’s do this!
$ git bisect start
$ git bisect good Ubuntu-5.13.0-22.22
$ git bisect bad Ubuntu-5.13.0-23.23
Bisecting: 316 revisions left to test after this (roughly 8 steps)
[398351230dab42d654036847a49a5839705abdcb] powerpc/bpf ppc32: Fix BPF_SUB when imm == 0x80000000
$ git describe --long
Ubuntu-5.13.0-22.22-317-g398351230dab
In this first step, I get the 317th commit after 5.13.0-22. Let’s compile that commit:
$ time make clean olddefconfig bindeb-pkg \
--jobs=$(getconf _NPROCESSORS_ONLN) \
LOCALVERSION=-$(git describe --long | tr '[:upper:]' '[:lower:]')
This creates 3 .deb packages (linux-image, linux-headers and linux-libc-dev) in the parent directory.
Now I can reboot, select the new kernel in Grub, and test the keyboard. With commit 317, the keyboard worked, so the first bad commit has to be somewhere between commit 317 and 634:
$ git bisect good ; git describe --long
Bisecting: 158 revisions left to test after this (roughly 7 steps)
[79b62d0bba892e8367cb46ca09b623c885852c29] drm/msm/a4xx: fix error handling in a4xx_gpu_init()
Ubuntu-5.13.0-22.22-475-g79b62d0bba89
Now it’s time again for make clean olddefconfig bindeb-pkg, dpkg --install and reboot. Turns out that commit 475 was a “bad” commit (one where the keyboard didn’t work):
$ git bisect bad ; git describe --long
Bisecting: 78 revisions left to test after this (roughly 6 steps)
[c3d35f3acc3a11b726959c7b2c25ab9e46310273] USB: serial: option: add Telit LE910Cx composition 0x1204
Ubuntu-5.13.0-22.22-396-gc3d35f3acc3a
I’m not going to describe all the steps in full detail; by now you should get the gist of it. This was the sequence of steps that git bisect gave me:
$ git bisect bad ; git describe --long
0fc979747dece96c189bc29ef604e61afbddfa2a is the first bad commit
commit 0fc979747dece96c189bc29ef604e61afbddfa2a
Author: Pavankumar Kondeti <pkondeti@codeaurora.org>
Date: Fri Oct 8 12:25:46 2021 +0300
xhci: Fix command ring pointer corruption while aborting a command
BugLink: https://bugs.launchpad.net/bugs/1951880
commit ff0e50d3564f33b7f4b35cadeabd951d66cfc570 upstream.
The command ring pointer is located at [6:63] bits of the command
ring control register (CRCR). All the control bits like command stop,
abort are located at [0:3] bits. While aborting a command, we read the
CRCR and set the abort bit and write to the CRCR. The read will always
give command ring pointer as all zeros. So we essentially write only
the control bits. Since we split the 64 bit write into two 32 bit writes,
there is a possibility of xHC command ring stopped before the upper
dword (all zeros) is written. If that happens, xHC updates the upper
dword of its internal command ring pointer with all zeros. Next time,
when the command ring is restarted, we see xHC memory access failures.
Fix this issue by only writing to the lower dword of CRCR where all
control bits are located.
Cc: stable@vger.kernel.org
Signed-off-by: Pavankumar Kondeti <pkondeti@codeaurora.org>
Signed-off-by: Mathias Nyman <mathias.nyman@linux.intel.com>
Link: https://lore.kernel.org/r/20211008092547.3996295-5-mathias.nyman@linux.intel.com
Signed-off-by: Greg Kroah-Hartman <gregkh@linuxfoundation.org>
Signed-off-by: Kamal Mostafa <kamal@canonical.com>
Signed-off-by: Stefan Bader <stefan.bader@canonical.com>
drivers/usb/host/xhci-ring.c | 14 ++++++++++----
1 file changed, 10 insertions(+), 4 deletions(-)
Ubuntu-5.13.0-22.22-387-g0fc979747dec
At first sight the commit description is quite cryptic, and the actual code change doesn’t tell me a lot either. But it’s a change in drivers/usb/host/xhci-ring.c, and xHCI stands for eXtensible Host Controller Interface, an interface specification for USB host controllers. If it’s an issue with the USB host controller, then it makes sense that neither of my 2 keyboards from different brands would work. It also suggests that other USB devices, like external hard drives, wouldn’t work either, but that’s a bit harder to test. A keyboard is easy: just look at the NumLock LED. If it doesn’t go on, then there’s an issue.
The first link in the commit description is just a long list of patches that were taken from upstream and integrated in the Ubuntu kernel, so that doesn’t help me. The second link is a thread on the kernel.org mailing list, and there it gets interesting.
kernel.org mailing list thread
Some excerpts from the thread:
This patch cause suspend to disk resume usb not work, xhci_hcd 0000:00:14.0: Abort failed to stop command ring: -110.
youling257
Thanks for the report, this is odd. Could you double check that by reverting this patch resume start working again. If this is the case maybe we need to write all 64bits before this xHC hardware reacts to CRCR register changes. Maybe following changes on top of current patch could help:
Mathias Nyman
Every time a developer says “this is odd”, my alarm bells go off. 😀
Further down in the thread there is a proposed update to the change. I’m going to try that patch, but that’s for another blog post.
Kernel ubuntu-5.13.0-22.22-0-g3ab15e228151 is, for all intents and purposes, the same as kernel 5.13.0-22-generic, so I expected it to be a “good” kernel, and it was.
For kernel Ubuntu-5.13.0-23.23 I did the same thing: starting from the git checkout. I skipped copying and editing the config file, because between minor releases I don’t expect there to be much change. I did run make olddefconfig for good measure, though. As expected, the keyboard and mouse didn’t work with the compiled ...-23 kernel.
Next up: using git bisect to find the exact commit where it went wrong. It’s got to be somewhere between ...-22 and ...-23!
As I wrote previously, I’m suspecting a Linux kernel bug somewhere between versions 5.13.0-22 and 5.13.0-23, in the Ubuntu kernels. I wanted to know if the issue only surfaced in Ubuntu-flavored kernels, or also in the upstream (mainline) kernels from kernel.org.
There is an Ubuntu Mainline PPA with all the upstream kernels, but I found it a bit too opaque to use. Fortunately I found the Ubuntu Mainline Kernel Installer (UMKI), a tool for installing the latest Linux kernels on Ubuntu-based distributions.
Ubuntu Mainline Kernel Installer (UMKI)
The UMKI is pretty straightforward. It fetches a list of kernels from the Ubuntu Mainline PPA, and its GUI displays both available and installed kernels, regardless of how they were installed. It installs the kernel, headers and modules. There is also a CLI client.
With that out of the way, there’s the matter of deciding which kernels to try. The “interesting” Ubuntu kernels are 5.13.0-22 and 5.13.0-23, so the mainline kernels I definitely want to test are the ones around those versions: 5.13.0 and 5.13.1. I also want to try the latest 5.13.x kernel, which is 5.13.19, and the most recent stable kernel, 5.16.11 (as of 2022-03-01).
To summarize, I have tested these mainline kernels:
5.13.0
5.13.1
5.13.19
5.16.11
The result (after several reboots)? With all of them, my keyboard and mouse worked without a hitch. That means the issue most likely doesn’t occur in (stable) mainline kernels, only in kernels with additional patches from Ubuntu.
The operating system on my computer is Ubuntu Linux, version 21.10 (Impish Indri). Recently I had an issue that, after a kernel update (and reboot), my USB keyboard and mouse didn’t work any more in the login screen. Huh, that’s unexpected. The issue was:
At the Grub boot menu, the keyboard works: I can use the keys, the numlock led lights up, the LCD of the Logitech G19 displays a logo.
At the Ubuntu login screen, the keyboard (and the mouse) went dark: no backlight of the keys, no numlock led, no logo on the display. And the mouse cursor didn’t move on screen.
Must be a problem on my end, I initially thought, because surely something as essential as input devices wouldn’t break from a simple kernel update? So I did some basic troubleshooting:
Have you tried to turn it off and on again?
Plug the keyboard in another USB port.
Try a different keyboard.
Start with the older kernel, which was still in the Grub menu. And indeed, this gave me back control over my input devices!
So if the only thing I changed was the kernel, then maybe it’s a kernel bug after all?
I know that Ubuntu 21.10 uses kernel 5.something, and I know that I use the generic kernels. So which kernels are we talking about, actually?
$ apt-cache show linux-image-5*-generic | grep Package: | sed 's/Package: //g'
linux-image-5.13.0-19-generic
linux-image-5.13.0-20-generic
linux-image-5.13.0-21-generic
linux-image-5.13.0-22-generic
linux-image-5.13.0-23-generic
linux-image-5.13.0-25-generic
linux-image-5.13.0-27-generic
linux-image-5.13.0-28-generic
linux-image-5.13.0-30-generic
9 kernels, that’s not too bad. All of them 5.13.0-XX-generic. So I just installed all the kernels:
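Something along these lines, using bash brace expansion to generate the nine package names from the apt-cache output above (the install line is commented out here as a safety net; it would pull in all nine images):

```shell
# Brace expansion builds the nine package names listed above.
packages="$(echo linux-image-5.13.0-{19,20,21,22,23,25,27,28,30}-generic)"
echo $packages
# sudo apt-get install --yes $packages
```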
I tried all these kernels. The last kernel where my input devices still worked, was 5.13.0-22-generic, and the first where they stopped working, was 5.13.0-23-generic. Which leads me to assume that some unintended change was introduced between those two versions, and it hasn’t been fixed since.
For now, I’m telling Ubuntu to keep kernel 5.13.0-22-generic and not upgrade to a more recent version.
$ sudo apt-mark hold linux-image-5.13.0-22-generic
linux-image-5.13.0-22-generic set on hold.
I also want Grub to show me the known working kernel as the default choice. To do that, I’ve put this in /etc/default/grub:
GRUB_DEFAULT="Advanced options for Ubuntu>Ubuntu, with Linux 5.13.0-22-generic"
followed by sudo update-grub.
I’ll do the following things next, to get to the bottom of this:
Compile kernels from source, to hopefully find the exact change that caused the USB input devices to stop working. git bisect helps a lot in narrowing down the broken commit.
A few months ago I wrote about my preferred region to work. Well, that’s no longer true. The co-housing project where I live (in Merelbeke, near Ghent) is going to end, and I need to move by the end of July 2022.
This also has an influence on my preferred place to work. I have decided to find a place to live not too far from work, wherever that may be (because I’m still on the #jobhunt). Ideally it would be inside the triangle Ghent-Antwerp-Brussels but I think I could even be convinced by the Leuven area.
Factors I’ll take into account:
Elevation and hydrology – with climate change, I don’t want to live somewhere with increased risk of flooding.
Proximity of essential shops and services.
Proximity of public transport with a good service.
Not too far from something green (a park will do just fine) to go for a walk or a run.
I haven’t started looking yet, I’m not even sure if I want to do co-housing again, or live on my own. That’ll depend on the price, I guess. (Living alone? In this economy???) First I want to land on a job.
That makes sense—without knowing where I will be working, house hunting feels a bit like putting the cart before the horse. Still, I find myself browsing listings occasionally, more out of curiosity than anything else. It is interesting to see how prices and availability vary wildly, even within the triangle I mentioned. Some towns look charming on paper but lack the basics I need; others tick all the boxes but come with a rental price that makes my eyebrows do gymnastics.
In the meantime, I am mentally preparing for a lot of change. Leaving my current co-housing situation is bittersweet. On one hand, it has been a wonderful experience: shared dinners, spontaneous conversations, and a real sense of community. On the other hand, living with others also means compromise, and part of me wonders what it would be like to have a space entirely to myself. No shared fridges, no waiting for the bathroom, and the joy of decorating a place to my own taste.
That said, co-housing still appeals to me. If I stumble upon a like-minded group or an interesting project in a new city, I would definitely consider it. The key will be finding something that balances affordability, autonomy, and connection. I do not need a commune, but I also do not want to feel isolated.
I suppose this transition is about more than just logistics—it is also a moment to rethink what I want day-to-day life to look like. Am I willing to commute a bit longer for a greener environment? Would I trade square meters for access to culture and nightlife? Do I want to wake up to birdsong or the rumble of trams?
These are the questions swirling around my head as I polish up my CV, send out job applications, and daydream about future homes. It is a lot to juggle, but oddly enough, I feel optimistic. This is a chance to design a new chapter from scratch. A little daunting, sure. But also full of possibility.
I use 2 email clients at the same time: Thunderbird and Gmail.
Thunderbird: runs on my local system, it’s very fast, it shows me all the metadata of an email in the way I want, the email list is not paged, I can use it for high volume actions on email. These happen on my local system, and then the IMAP protocol gradually syncs it to Gmail. I also find that Thunderbird’s email notifications integrate nicer in Ubuntu.
Gmail: can’t be beaten for search. It also groups mail conversations. And then there are labels!
How to turn on Conversation View in Gmail
Gmail has several tabs: Primary, Social, Promotions, Updates and Forums. Gmail is usually smart enough that it can classify most emails in the correct tab. If it doesn’t: drag the email to the correct tab, and Gmail will ask you if all future emails of that sender should go to the same tab. This system works well enough for me. My email routine is to first check the tabs Social, Promotions and Forums, and delete or unsubscribe from most emails that end up there. All emails about the #jobhunt go to Updates. I clean up the other emails in that tab (delete, unsubscribe, filter, archive) so that only the #jobhunt emails remain. Those I give a label – more about that later. Then I go to the Inbox. Any emails there (shouldn’t be many) are also taken care of: delete, unsubscribe, filter, archive or reply.
Enable Gmail tabs
Google has 3 Send options: regular Send, Schedule send (which I don’t use) and Send + Archive. The last one is probably my favorite button. When I reply to an email, it is in most cases a final action on that item, so after the email is sent, it’s dealt with, and I don’t need to see it in my Inbox any more. And if there is a reply on the email, then the entire conversation will just go to the Inbox again (unarchived).
Send + Archive
I love labels! At the level of an individual email, you can add several labels. The tabs are also labels, so if you add the label Inbox to an archived email, then it will be shown in the Inbox again. At the level of the entire mailbox, labels behave a bit like mail folders. You can even have labels within labels, in a directory structure. Contrary to traditional mail clients, where an email could only be in one mail folder, you can add as many labels as you want. The labels are also shown as folders in an IMAP mail client like Thunderbird. If you move an email from one folder to another, then the corresponding label gets updated in Gmail. The filters that I use in my #jobhunt are work/jobhunt, work/jobhunt/call_back, work/jobhunt/not_interesting, work/jobhunt/not_interesting/freelance, work/jobhunt/not_interesting/abroad, work/jobsites and work/coaching. The emails that end up with the abroad label, are source material for my blog post Working Abroad?
The label list on the left looks like a directory structure. It’s actually a mix of labels and traditional folders like Sent, Drafts, Spam, Trash,… Those are always visible at the top. Then there is a neat little trick for labels. If you have a lot of labels, like me, then Gmail will hide some of them behind a “More” button. You can influence which labels are always visible by selecting Show if unread on a label. This only applies to top-level labels. When there are no unread emails with that label or any of its sublabels, the label is hidden below the More button. As soon as there are unread mails with that label or any of its sublabels, the label becomes visible. Mark all mails as read, and the label is out of view again. Less clutter: you only see it when you need it.
Show if unread
Filters, filters, filters. I think I have a gazillion filters. (208, actually – I exported them to XML so I could count them) Each time I have more than two emails that have something meaningful in common, I make a filter. Most of my filters have the setting ‘Skip Inbox’. They will remain unread in the label where I put them, and I’ll read them when it’s convenient for me. For example, emails that are automatically labelled takeaway aren’t important and don’t need to be in the Inbox, but when I want to order takeaway, I take a look in that folder to see if there are any promo codes.
Email templates. Write a draft email, click on the 3 dots bottom right, save draft as template. Now I can reuse the same text so that I don’t have to write for the umpteenth time that I don’t do freelance. I could send an autoreply with templates, but for now I’ll still do it manually.
LinkedIn
I can be short about that: it’s a mess. You can only access LinkedIn messages from the website, and if you have a lot of messages, then it behaves like a garbage pile. Some people also have an expectation that it’s some sort of instant messaging. For me it definitely isn’t. And just like with email: I archive LinkedIn chats as soon as I have replied.
I used to have an autoreply that told people to email me, and gave a link to my CV and my blog. What do you think, should I enable that again?
When writing unit tests, especially in JavaScript/TypeScript with Jest, you often run into a common problem: repetition. Imagine testing a function with several input-output pairs. The tests can become bloated and harder to read. This is where Jest’s .each syntax shines. It lets you write cleaner, data-driven tests with minimal duplication.
The Problem: Repetitive Test Cases
Take a simple sum function:
function sum(a, b) {
return a + b;
}
Without .each, you might write your tests like this:
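A sketch of both versions follows: first three near-identical test blocks, then the same cases as one data-driven test. The shims at the top are only there so the snippet also runs outside Jest, where `test`, `test.each` and `expect` are provided as globals.

```javascript
// Stand-ins so this sketch runs outside a Jest environment;
// in a real Jest suite these are globals and the shims go away.
const expect = (actual) => ({
  toBe: (expected) => console.assert(actual === expected),
});
const test = (name, fn) => fn();
test.each = (cases) => (name, fn) => cases.forEach((args) => fn(...args));

function sum(a, b) {
  return a + b;
}

// Without .each: one test block per input-output pair.
test('adds 1 + 2 to equal 3', () => expect(sum(1, 2)).toBe(3));
test('adds 2 + 3 to equal 5', () => expect(sum(2, 3)).toBe(5));
test('adds -1 + -1 to equal -2', () => expect(sum(-1, -1)).toBe(-2));

// With .each: the same three cases as one data-driven test.
test.each([
  [1, 2, 3],
  [2, 3, 5],
  [-1, -1, -2],
])('sum(%i, %i) returns %i', (a, b, expected) => {
  expect(sum(a, b)).toBe(expected);
});
```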
This single block of code replaces three separate test cases. Each array in the .each list corresponds to a test run, and Jest automatically substitutes the values.
Bonus: Named Arguments with Tagged Template Literals
You can also use named arguments for clarity:
test.each`
a | b | expected
${1} | ${2} | ${3}
${2} | ${3} | ${5}
${-1}| ${-1}| ${-2}
`('returns $expected when $a + $b', ({ a, b, expected }) => {
expect(sum(a, b)).toBe(expected);
});
This syntax is more readable, especially when dealing with longer or more descriptive variable names. It reads like a mini table of test cases.
Why Use .each?
Less boilerplate: Define the test once and reuse it.
Better readability: Data-driven tests are easier to scan.
Easier maintenance: Add or remove cases without duplicating test logic.
Fewer mistakes: Repeating the same code invites copy-paste errors.
Use Case: Validating Multiple Inputs
Suppose you are testing a validation function like isEmail. You can define all test cases in one place:
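A sketch, assuming a hypothetical and deliberately naive isEmail; the two shim lines at the top only exist so the snippet runs outside Jest:

```javascript
// Stand-ins for the Jest globals, so this runs in plain Node.
const expect = (actual) => ({
  toBe: (expected) => console.assert(actual === expected),
});
const test = { each: (cases) => (name, fn) => cases.forEach((args) => fn(...args)) };

// Naive email check for illustration only; nowhere near RFC 5322.
const isEmail = (s) => /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(s);

// All cases live in one table: input on the left, verdict on the right.
test.each([
  ['user@example.com', true],
  ['first.last@sub.example.org', true],
  ['no-at-sign.example.com', false],
  ['user@no-dot-domain', false],
  ['', false],
])('isEmail(%s) is %p', (input, expected) => {
  expect(isEmail(input)).toBe(expected);
});
```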
This approach scales better than writing individual test blocks for every email address.
Conclusion
Jest’s .each is a powerful way to reduce duplication in your test suite. It helps you write cleaner, more maintainable, and more expressive tests. Next time you find yourself writing nearly identical test cases, reach for .each—your future self will thank you.
The Gilded Rose Kata by Emily Bache is a staple in refactoring exercises. It offers a deceptively simple problem: refactor an existing codebase while preserving its behavior. I recently worked through the TypeScript version of the kata, and this post documents the transformation from a legacy mess into clean, testable code—with examples along the way.
But before diving into the code, I should mention: this was my very first encounter with TypeScript. I had never written a single line in the language before this exercise. That added an extra layer of learning—on top of refactoring legacy code, I was also picking up TypeScript’s type system, syntax, and tooling from scratch.
🧪 Development Workflow
Pre-Commit Hooks
pre-commit.com is a framework for managing and maintaining multi-language pre-commit hooks. It allows you to define a set of checks (such as code formatting, linting, or security scans) that automatically run before every commit, helping ensure code quality and consistency across a team. Hooks are easily configured in a .pre-commit-config.yaml file and can be reused from popular repositories or custom scripts. It integrates seamlessly with Git and supports many languages and tools out of the box.
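A minimal .pre-commit-config.yaml might look like this; the hook selection is illustrative, but the repository and hook ids below are the standard ones from pre-commit/pre-commit-hooks:

```yaml
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.1.0
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
      - id: check-yaml
```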
GitHub Actions was used to automate the testing workflow, ensuring that every push runs the full test suite. This provides immediate feedback when changes break functionality, which was especially important while refactoring the legacy Gilded Rose code. The setup installs dependencies with npm, runs tests with yarn, and ensures consistent results across different environments—helping maintain code quality and giving confidence to refactor freely while learning TypeScript.
name: Build
on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]
jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        node-version: [12.x]
    steps:
    - uses: actions/checkout@v2
    - name: Node.js
      uses: actions/setup-node@v1
      with:
        node-version: ${{ matrix.node-version }}
    - run: npm install -g yarn
      working-directory: ./TypeScript
    - name: yarn install, compile and test
      run: |
        yarn
        yarn compile
        yarn test
      working-directory: ./TypeScript
🔍 Starting Point: Legacy Logic
Originally, everything was handled in a massive updateQuality() function using nested if statements like this:
if (item.name !== 'Aged Brie' && item.name !== 'Backstage passes') {
if (item.quality > 0) {
item.quality--;
}
} else {
if (item.quality < 50) {
item.quality++;
}
}
The function mixed different concerns and was painful to extend.
🧪 Building Safety Nets
Golden master tests are a technique used to protect legacy code during refactoring by capturing the current behavior of the system and comparing it against future runs. In this project, I recorded the output of the original updateQuality() function across many item variations. As changes were made to clean up and restructure the logic, the tests ensured that the external behavior remained identical. This approach was especially useful when the codebase was poorly understood or untested, offering a reliable safety net while improving internal structure.
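A minimal sketch of the idea; legacyUpdate is a trivial stand-in for the kata’s updateQuality, not the real function:

```typescript
// Golden-master sketch: record outputs of the legacy code once,
// then assert that every later run reproduces them exactly.
const legacyUpdate = (quality: number): number =>
  quality > 0 ? quality - 1 : quality; // stand-in for updateQuality()

// Step 1, done once before refactoring: capture the "golden"
// outputs for a wide range of generated inputs.
const inputs: number[] = Array.from({ length: 60 }, (_, i) => i - 5);
const golden: number[] = inputs.map(legacyUpdate);

// Step 2, after every refactoring step: run again and compare.
// Here it calls the same function; in practice it would call
// the refactored code.
const current: number[] = inputs.map(legacyUpdate);
const behaviorUnchanged: boolean = golden.every((v, i) => v === current[i]);
```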
This isolated the business rules from boilerplate iteration.
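That extraction can be sketched as follows; the Item shape matches the kata, but updateItem and the single rule inside it are illustrative:

```typescript
class Item {
  constructor(public name: string, public sellIn: number, public quality: number) {}
}

class GildedRose {
  constructor(private items: Item[]) {}

  // The loop only iterates; all business rules live in updateItem.
  updateQuality(): Item[] {
    for (const item of this.items) {
      this.updateItem(item);
    }
    return this.items;
  }

  // Per-item rules, extracted from the big nested-if block
  // (only the "normal item" rule is shown here).
  private updateItem(item: Item): void {
    if (item.quality > 0) {
      item.quality--;
    }
    item.sellIn--;
  }
}
```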
2. Replacing Conditionals with a switch
Using a switch statement instead of multiple if/else if blocks makes the code cleaner, more readable, and easier to maintain—especially when checking a single variable (like item.name) against several known values. It clearly separates each case, making it easier to scan and reason about the logic. In the Gilded Rose project, switching to switch also made it easier to later refactor into specialized handlers or classes for each item type, as each case represented a clear and distinct behavior to isolate.
switch (item.name) {
case 'Aged Brie':
this.updateBrie(item);
break;
case 'Sulfuras':
break; // no-op
case 'Backstage passes':
this.updateBackstage(item);
break;
default:
this.updateNormal(item);
}
This increased clarity and prepared the ground for polymorphism or factory patterns later.
🛠 Polishing the Code
Constants and Math Utilities
Instead of magic strings and numbers, I introduced constants:
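Something along these lines (the names are illustrative):

```typescript
// Item names and quality bounds as named constants instead of
// magic strings and numbers scattered through the code.
const AGED_BRIE = 'Aged Brie';
const BACKSTAGE_PASSES = 'Backstage passes';
const SULFURAS = 'Sulfuras';

const MIN_QUALITY = 0;
const MAX_QUALITY = 50;

// Small math utility: keep quality inside its legal range.
const clampQuality = (quality: number): number =>
  Math.min(MAX_QUALITY, Math.max(MIN_QUALITY, quality));
```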
The factory pattern is a design pattern that creates objects without exposing the exact class or construction logic to the code that uses them. Instead of instantiating classes directly with new, a factory function or class decides which subclass to return based on input—like item names in the Gilded Rose kata. This makes it easy to add new behaviors (e.g., “Conjured” items) without changing existing logic, supporting the Open/Closed Principle and keeping the code modular and easier to test or extend.
switch (true) {
case /^Conjured/.test(item.name):
return new ConjuredItem(item);
case item.name === 'Sulfuras':
return new SulfurasItem(item);
// ...
}
🌟 Feature Additions
With structure in place, adding Conjured Items was straightforward:
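A sketch of such a handler; the class shape is illustrative, while the rule itself (conjured items degrade in quality twice as fast as normal items) comes from the kata’s requirements:

```typescript
class ConjuredItem {
  constructor(public name: string, public sellIn: number, public quality: number) {}

  // Twice the normal degradation: 2 per day before the sell date,
  // 4 per day after it, never dropping below zero.
  update(): void {
    const degradation = this.sellIn <= 0 ? 4 : 2;
    this.quality = Math.max(0, this.quality - degradation);
    this.sellIn--;
  }
}
```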