Don't Stop at Workflows: Build a Compounding Content System

Tim Metz

11 min

March 19th, 2026

When we started work on our AI-powered LinkedIn service, we thought two workflows in AirOps would do: one to turn raw materials into briefs, another to turn briefs into posts. Our editors would tighten hooks, cut slop, and make sure the customer sounded like the customer.

We were right that we needed two workflows. We were wrong that they'd be enough.

To create differentiated content with AI, we had to build a system, with data infrastructure, customer-facing tools, and feedback loops.

Here's what it took to build that system, layer by layer, through what we've since called the AI onion.

First, Your Workflows Need to Work

To get to a system, you do need working workflows that don't produce slop. Most teams that try never get past this step.

The first version of most AI workflows is a long-winded way to produce garbage. Only by trying, tweaking, testing, failing, trying again, cursing at your computer, and doing more tweaking and testing and cursing do you arrive at something acceptable.

This process might sound like writing — writing, rewriting, editing, cursing, rewriting again, polishing — but it isn't. It's engineering.

We learned this when we assigned Nathan Wahl, our Editorial Director, all-round curator of good taste, and resident Slop Slayer, to tame our workflows.

Nathan Wahl, Slayer of Slop, Protector of Quality, doing what he likes best.

Nathan's editorial instincts are razor-sharp, so we assumed AI + Nathan would translate into high-quality workflow outputs. Instead, he suddenly spent his days wrestling with questions like:

  • Is the system improving when I remove this line from a brand kit?

  • Can this prompt handle edge cases?

  • Why did this batch produce garbage when the last one was fine?

  • Did something break over here because I fixed something over there?

Nathan made progress, but he hated the work. So we let him refocus on craft and quality, and assigned a stubborn builder and AI enthusiast to the workflow problem: me.

It still took me several months (!) to get the workflows to do what we wanted, and that's when we finally advanced to the next layer of the onion — only to discover that working workflows create their own problems.

Behind mountains are more mountains.
Haitian proverb

Make Data the Beating Heart of Your System

Keeping data organized is hard enough. Add AI workflows, and you're drowning in it.

Working workflows blanket you in drafts, because pressing "Run Workflow" is as easy as it sounds. Meanwhile, customer materials keep arriving by email, Slack, and Google Docs. You lose track of what lives where, which version is current, and what's been processed.

When your workflows work, your Publish button starts looking like this, and you start drowning in data.

We turned to databases to defend ourselves from this information assault.

We chose Notion because it's easy to access for both humans and robots. Our LinkedIn team can work where they're already comfortable — making edits, reviewing briefs, providing feedback — without having to go into AirOps each time. And tools like Claude Code can read from and write to it, which makes automation possible.

But to make this setup a reality, we had to build infrastructure.

Notion can't talk to AirOps directly — the data formats don't match — so we created a server-side router to be their translator. (Claude's idea, to be fair.)

With that in place, the databases started feeding the workflows. Adding a raw material triggers brief generation, which delivers the brief back to Notion, where AI checks it and sends it through content creation, with the finished draft landing back in Notion.
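To make the translator idea concrete, here's a minimal sketch (not our actual router code) of the kind of flattening it does: turning a Notion-style page payload into the plain key/value inputs a workflow platform expects. The property shapes follow Notion's API conventions, but the field names and the target format are made up for illustration.

```python
# Illustrative sketch of the router's "translator" job: flatten Notion's
# nested property objects into simple strings a workflow can consume.
# Property shapes mirror Notion's API; field names are hypothetical.

def notion_to_workflow_inputs(page: dict) -> dict:
    """Flatten a Notion page's properties into a flat dict of strings."""
    inputs = {}
    for name, prop in page.get("properties", {}).items():
        kind = prop.get("type")
        if kind == "title":
            inputs[name] = "".join(t["plain_text"] for t in prop["title"])
        elif kind == "rich_text":
            inputs[name] = "".join(t["plain_text"] for t in prop["rich_text"])
        elif kind == "select":
            inputs[name] = (prop.get("select") or {}).get("name", "")
        elif kind == "url":
            inputs[name] = prop.get("url") or ""
    return inputs

page = {
    "properties": {
        "Title": {"type": "title", "title": [{"plain_text": "Raw material"}]},
        "Status": {"type": "select", "select": {"name": "Ready"}},
        "Source": {"type": "url", "url": "https://example.com/post"},
    }
}
print(notion_to_workflow_inputs(page))
# {'Title': 'Raw material', 'Status': 'Ready', 'Source': 'https://example.com/post'}
```

In the real setup, a thin HTTP layer around a function like this is what lets the databases and workflows talk to each other at all.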

The router would become useful far beyond its original job: status triggers, Slack notifications, and automatic transcription of submitted YouTube links. (Diagram made with Claude Code and the Whimsical MCP.)

Eventually, we looked up at what we had willed into existence.

Our workflows were no longer at the center. Data was. We had built a content system: infrastructure and workflows working together to produce content at scale, improving through its own feedback loops. And it started handing us opportunities we never planned for.

Screw Efficiency, Build Your Dream Solutions

Speed. Cost reduction. Efficiency. That's where most teams' thinking about AI begins and ends. But it's too narrow a lens: you only ever optimize what you already have and never discover what you could build.

We figured this out by accident.

We needed to onboard our first customer for the new LinkedIn program. Our traditional intake form didn't capture the information the new system needed. With the customer ready to start, there was no time for a developer to build a new form. So I turned to the only possible solution: vibe coding.

During that adventure, I realized AI could probably do more than just create a new form.

Anyone who's worked with an agency knows the frustration of repeating information you've already provided. What if we fed AI all the materials from the sales process — call transcripts, proposals, research — to pre-fill as much of the form as possible?
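The pre-fill step boils down to one structured-extraction prompt. Here's a hedged sketch of what building that prompt could look like; the form fields, schema, and wording are all invented, not our production prompt.

```python
import json

# Hypothetical sketch of the pre-fill prompt: ask a model to fill the
# intake form's fields from sales materials, returning JSON so the form
# can highlight which fields were pre-filled. Field names are invented.

FORM_FIELDS = ["company_name", "audience", "tone_of_voice", "topics_to_avoid"]

def build_prefill_prompt(materials: list[str]) -> str:
    """Assemble a single extraction prompt from raw sales materials."""
    schema = {field: "string, or null if unknown" for field in FORM_FIELDS}
    return (
        "Fill in as many intake-form fields as the materials support.\n"
        "Leave a field null rather than guessing.\n"
        f"Respond with JSON matching: {json.dumps(schema)}\n\n"
        "Materials:\n" + "\n---\n".join(materials)
    )

prompt = build_prefill_prompt([
    "Call transcript: the client sells developer tools to platform teams.",
    "Proposal: monthly LinkedIn program for two executives.",
])
```

The "null rather than guessing" instruction matters: an empty field a customer fills in beats a confidently wrong pre-fill they have to notice and correct.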

A quick prompt and some website screenshots later, we had an on-brand, multi-step form with highlighted, pre-filled fields.

Customers loved it.

The old form was often filled out in a hurry, or not at all. The new one is longer than the original, yet 80% of customers complete it in under 24 hours.

One of our customers even reached out to ask how exactly we built such a pre-filled form!

The form was a turning point. It showed that dream solutions we'd always written off as unrealistic — or simply didn't consider — were now within reach.

Once that clicked, the question became automatic: what's the dream solution?

We started applying this lens everywhere.

Many founders and startups don't have mature style guides (if they have them at all).

Old solution: schedule more interviews, chase more feedback.

Dream solution: build a style calibrator that captures someone's voice by letting them pick from custom-generated examples, rather than asking them to articulate a style they can't define.

Customers submitted more raw materials over Slack and email than our team could handle.

Old solution: a shared folder and spreadsheet.

Dream solution: a form that handles submissions end-to-end — extracting articles, transcribing videos, running QA checks — and lands everything in the database.

Then someone wanted to capture materials while browsing.

Old solution: copy-paste into a doc.

Dream solution: a Chrome extension that captures your current page.

Two years ago, a content team building its own tools would have sounded absurd. Now they're Tuesday afternoon projects.

Tap into the Feedback Loops

A system offers something workflows never will: the ability to learn from its own outputs.

Improve a piece of content by hand, and you've fixed that one piece. Improve a part of your system, and every future output gets better. That's the compounding effect the title of this article promises.

It starts with small tweaks. Adding a step that hunts down em dashes, or correcting a mistake in an author's bio that keeps producing strange anecdotes — fix once, fixed forever.

But the real compounding happens when the system's output loops back as input automatically.

From the start, we dreamed of LinkedIn engagement metrics flowing back into the system, but we had no clue how. As the system took shape — the databases, the router, the tools — we realized we could plug in third-party data.

We were already working with a LinkedIn scheduling tool and found out they had an API, so we connected it to our Notion databases.

Now, once every 24 hours, our router pulls published posts and their engagement metrics. AI and our team use this data to refine brand kits, adjust strategy, and tweak workflows.
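The core of that nightly pull is a simple merge: join each published post with whatever metrics the scheduler's API returned for it. A minimal sketch, with invented field names standing in for the real API's:

```python
# Hedged sketch of the nightly sync's merge step: attach engagement
# metrics (keyed by post URL) to each published post record before the
# router writes them back to the database. Field names are hypothetical.

def merge_engagement(posts: list[dict], metrics: dict[str, dict]) -> list[dict]:
    """Return post records enriched with engagement metrics, defaulting to 0."""
    merged = []
    for post in posts:
        stats = metrics.get(post["url"], {})
        merged.append({
            **post,
            "impressions": stats.get("impressions", 0),
            "reactions": stats.get("reactions", 0),
            "comments": stats.get("comments", 0),
        })
    return merged

posts = [{"url": "https://example.com/p/1", "title": "Launch post"}]
metrics = {"https://example.com/p/1": {"impressions": 1200, "reactions": 40}}
merged = merge_engagement(posts, metrics)
```

Defaulting missing metrics to zero, rather than dropping the post, keeps the database complete even when the API lags behind a fresh publish.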

(Diagram made with Claude Code and the Whimsical MCP.)

We've also partnered with QueryM to monitor social media for trending topics. When something relevant happens, QueryM sends a signal to our router and adds raw material about the event to our database.

AI checks it, the brief workflow kicks in, and within an hour our team has ideas to review. They pick the strongest angle, send the brief through the next workflow, edit the draft, and customers have relevant content within hours of something happening in their industry.
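The hook itself can be sketched in a few lines; the signal payload and status values here are hypothetical, but the key idea is real: writing the record with the right status is what triggers the next workflow downstream.

```python
# Illustrative sketch of the trend-signal hook: an incoming signal becomes
# a raw-material record whose status kicks off the brief workflow.
# Payload shape and status strings are invented for this example.

def signal_to_raw_material(signal: dict) -> dict:
    """Turn a trending-topic signal into a database-ready raw material."""
    return {
        "title": f"Trend: {signal['topic']}",
        "body": signal.get("summary", ""),
        "source": signal.get("url", ""),
        # The status change, not a direct call, triggers the next step.
        "status": "Needs AI check",
    }

record = signal_to_raw_material({
    "topic": "EU platform regulation vote",
    "summary": "Parliament approved new platform rules this morning.",
    "url": "https://example.com/news/vote",
})
```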

The better the engagement data, the sharper the strategy. The sharper the strategy, the more relevant the signals QueryM catches. Every cycle compounds.

Add Speed Bumps or Fall Asleep at the Wheel

Most people are skeptical about AI — until the system starts producing content. I've watched this happen with multiple people, and the pattern is always the same.

Before we start building, they say things like:

  • "AI can't do this."

  • "I don't trust it."

  • "The output will not be good enough."

Then the system delivers its first posts and the skepticism disappears.

Our team was so impressed that their first request was full autopilot mode: raw materials flowing into briefs, briefs into drafts; no human touch unless the system flags a major problem.

But after the first batches, patterns appeared that you'd never catch looking at a single post: topics repeated. Hooks sounded templatized. Tone drifted in subtle ways.

Each post still looked fine on its own. Zoomed out across a dozen, AI's fingerprint started to show. Yet by then, the team had already decided the system worked. They took on way more work than they could have without it and stopped paying close attention.

Sometimes I wonder if we made an airplane that is too easy to fly. Because in a difficult airplane the crews may stay more alert.
Bernard Ziegler, Airbus chief engineer (The Glass Cage)

The fix? Speed bumps: deliberate friction points that confront your team with the output before it ships, rather than trusting the system by default.

Our team now reviews and approves every brief before it moves to the next phase. Upstream, they decide which ideas to extract from raw materials, instead of the AI deciding. And we redesigned how drafts come out: less like finished posts, more like working documents, with pointers, checks, and decisions for the team to make before publishing.

The system is powerful. But it needs sharp humans staying awake and in control.

Your Compounding System Is an Unbeatable Moat

When your system first ships, it won't match your best people. A skilled marketer working alone will still produce better content.

What the human can't do is compound. Every fix they make improves one piece of content. Every fix you make to your system improves every future output. The human's growth curve is slow and eventually flat. Your system's keeps climbing.

Given enough improvement cycles, the system's curve crosses the human's, and the gap only widens. That crossing point is your moat.

Your competitors can use the same AI models you use. With effort, they can reverse-engineer your workflows. But a content system — with its data infrastructure, feedback loops, proprietary information, and the hard-won knowledge of how to keep it all running — is a competitive moat that can't be copied.

We built our LinkedIn content system through months of trial, error, and layers of problems we didn't see coming. But work through them all, and your system becomes a strategic asset that gets stronger every day, and harder to compete with every cycle.

If you're starting that journey or are stuck partway through, we'd love to talk about what we've learned and how we can help.

Thanks to Nathan Wahl, Peter Carleton, and Ronnie Higgins for reading earlier versions — and ensuring those didn't get published.