The Name Was the First Decision

Most AI architectures get named after technical metaphors. Pipelines. Hubs. Engines. Platforms. The name signals what the thing is for: processing, connecting, running, hosting.

We named ours The Atelier.

An atelier is a workshop. It is where artisans practice their craft. It is a word with weight. It implies that what happens there matters, that the people doing it have developed something worth developing, and that the work itself is the point. That is not an accident. Artisan Studios is a collection of artisans in a studio where craft happens. We did not reach for a technical metaphor. We reached for who we are.

That choice turned out to matter more than I expected.

Why Naming Is a Design Decision

When you name your AI architecture something generic, you have already made a values decision. You have decided that how the AI thinks and what it treats as important are implementation details, not design inputs. You have handed that to the vendor.

When you name it something that reflects your organization, you have made the opposite decision. You have said that the AI is an expression of who we are, not a tool you are running. That distinction shapes every subsequent choice.

The organizations that get AI transformation to stick do not treat the naming as branding. They treat it as the first values statement, made before a single line of code is written.

I have been thinking about why this is true for a long time. Longer than the current AI moment, actually. In 2015, I wrote about what makes change fail inside organizations. The argument I kept coming back to was this: the conditions that make people willing to change are the same conditions that make change worth having. Trust. A sense that the thing being built reflects them, not just a decision made above their heads. Autonomy within something they recognize as theirs.

AI adoption is just change with better marketing. The conditions are identical.

The Role Was the Second Decision

Once we had the name, we had to decide what role the AI would play.

Not master. Not servant. We landed on Steward.

If you know Tolkien, you know the Stewards of Gondor: figures who held authority in trust, acted with real power, but always in service of something larger. The name was not chosen to make the architecture sound literary. It was chosen because it captured the relationship exactly. Autonomy within guardrails. Acting with authority. Holding something carefully on behalf of people who need to trust it.

That framing mapped directly to one of Artisan's core values: sculpt trust. Not build trust. Not earn trust. Sculpt it, the way an artisan shapes something, deliberately, with craft and intention.

When we said the AI's role was Steward, we were making a claim about what kind of trust we were building and for whom. That is not a philosophical abstraction. It is a design constraint. Every decision about what the AI does and does not do on its own, what it escalates, what it holds in trust: those decisions flow from that role definition.

What This Has to Do with You

I am not writing this to recommend that every organization build an Atelier. The name is ours. The metaphor belongs to Artisan's culture, not to yours.

The pattern is what matters. Before your organization deploys AI, two questions deserve more time than they usually get.

What do you call this, and why? If the name came from a vendor or a committee trying to sound modern, you have skipped a design step.

What role does the AI play, and what does that say about your values? Not the marketing answer. The working answer that shapes what the system actually does.

The organizations that can answer both questions clearly tend to build AI that their people recognize as being of them. The ones that cannot tend to build AI that works fine technically and gets quietly abandoned anyway.

One more dimension worth naming: the answers to both questions are not universal. What signals trust in Chicago does not signal trust in Tokyo. The role an AI can play inside a flat, egalitarian organization looks nothing like what works in a hierarchical, high-context one. The pattern of starting with identity holds across every geography. The expression of that identity has to flex.

I watched this happen with other kinds of technology for a decade before AI was the conversation. The pattern was the same then. It is the same now.

Originally published on Medium
