Today I want to pull out two ideas from Cameron Tonkinwise’s Service Design and Social Complexity (although it goes without saying that the essay itself has many more ideas I won’t have a chance to touch on in this format).
The first idea is a tidy formulation of what design is. Cameron talks about service design specifically, but this idea is broader: that there is complexity, and regularization, and we broadly associate design with the latter. Linux vs. Apple. “Don’t make me think.” But designing is not solely regularizing:
To engage in complex discovery or creativity in certain domains of living, or working, it is necessary to off-load other domains of living or working to the regularity of what has been designed. The challenge is deciding what to regularize and what complexity to make space for.
The second idea is the keystone for this issue of the Picnic. Cameron doesn’t say it outright, but it jumps out at me from between the lines: process design is service design. The design process — hell, the entire product delivery lifecycle — is a structure that manages relationships of service between parties. The entire thing is governed by formal and informal agreements of exchange.
Or, it used to be. Because one of the highest-profile promises of the so-called AI era is that service relationships are for dopes, and you can opt out of them. People can be replaced entirely, or at least pushed to arm’s reach, by agents who will perform the same tasks for you.
Design undermined its own service relationships.
Not all service relationships are equally impacted. Those that were deliberately designed will endure. But ours were designed by circumstances; design was in the room because stakeholders needed someone to draw the pictures. We never managed to convince them otherwise; our appeals to craft and taste reinforced the notion that our value was that of an artist, not a strategist.
Design made itself very vulnerable to this disruption long before ChatGPT came about. We were already letting tools and templates think on our behalf. The AI part is just a small increment on top of a design practice already defined by dropping the right pattern in the right slot along the user journey:
AI-supported design software is a small and predictable step beyond the libraries of design patterns that have been the core value proposition of computer-aided design for decades.
Even today, the reflex of the designer is to demonstrate how good we are with these new tools. But stakeholders who didn’t already care about design won’t start to care just because you “learned AI.” The value proposition of AI tools is not “more productive colleagues” but “no colleagues at all.”
We cannot repair the service relationship by getting really good at the tool made for rejecting service relationships. All it does is accelerate the collapse of those relationships — and along with them, the collapse of design’s professional identity. The feedback loop between making and thinking is replaced with a unilateral, janitorial responsibility: stakeholders make, designers clean up.
The dominant activity becomes not making but preventing the system from producing something mediocre.
What’s left of the design process is now in tatters. We already talked about the “brain fry” associated with checking an endless stream of AI outputs in a prior issue. But when design is the designated janitor, all that brain fry becomes concentrated in us; the pain of using AI tools is invisibly displaced onto people told they should be happy to still have a job.
The worst part is that you can’t QA your way into faster velocity, because each review cycle makes you 10x slower. You can only accelerate delivery by eliminating the need for review stages, which requires teams to embrace a culture of quality. And where designers have refused to serve as a backstop for sloppy thinking, we can already see the opposite happening: without our help, the entire system rapidly erodes. AI has managed to 10x the noise, but not the value.
The tools can’t help us anymore. We have to help ourselves.
Fortunately it’s not too late, because being the person who made the tool do the thing is not design, and it never was. Mastering Claude Code to push out prototypes more quickly will only intensify the problem we find ourselves in, putting ever more focus on a thing that lets us feel productive while the value-creating relationships around us fray into uselessness.
Prototyping was only ever another tool to help us make any of the many kinds of artifact that could then scaffold the design process. As soon as we began to over-index on the thing itself and not what we did with it, everything fell apart.
Because that’s where the actual productivity comes from: the relationships. Socializing your idea is the real value of the prototype:
A prototype that isn’t socialized isn’t a prototype at all, it is just an early attempt at making something: the focus is on the object being made, not the idea it represents. And it’s that human socialization and collective meaning-making that turn an artifact-in-process into a prototype that is refining something outside of itself.
The most immediate way in which socializing helps the prototype is that other people (experts in the domains you are not) can tell you that it’s wrong. Which is something that often feels forgotten: the purpose of prototyping is to try something in the cheapest format possible, so that we can come to know the ways in which it is wrong.
The promise of the AI industry — that you too can become a Leonardo da Vinci-style polymath — is thus counterproductive to how ideas evolve and mature. People have contextual knowledge: the nuance that turns yet another generic app idea into a success. AI can only give you assumptions rendered down from a hundred million solutions, divorced from the problem in front of you. You end up shipping the same template as everyone else.
You learn nothing that way. And trying to accelerate that process by talking to even fewer people will lead to a lot of waste that can never produce the right solution. The AI will keep telling you that you are brilliant, and nobody else will care, because you will have automated the process of not listening.
Designers and product people [tried] to turn talking to people into terms engineering people find more cuddly. Like "framework". Or "system". … The problem isn't that you need a better system. The problem is you're avoiding doing the work.
Juicing up your AI skills makes you a better handmaid to the process of not listening. But it is not the process from which design derives its legitimacy, nor the process by which it creates value.
To close the loop with the first link: think about what you are choosing to regularize (or what is being chosen for you), and whether it makes space for beneficial complexity or drives it underground.
— Pavel at the Product Picnic
