Essay

The Case for Forward Deployed Engineers

April 2026

There is a prevailing fantasy in the technology industry that software sells itself. That a sufficiently elegant product, built by sufficiently talented engineers in a sufficiently well-lit office, will naturally find its way into the operations of complex institutions and begin delivering value. This is a fantasy. It has always been a fantasy. And in the age of AI, it is a dangerous one.

The hard problems in artificial intelligence are not, as many suppose, primarily technical. The transformer architecture is known. The training techniques are published. The inference infrastructure, while non-trivial, is increasingly commoditized. What remains genuinely difficult—what separates systems that work in demos from systems that work in the world—is the messy, unglamorous, deeply human work of embedding institutional knowledge into software.

This is the work of the forward deployed engineer.

The Knowledge Gap

Every organization of meaningful size is an epistemic universe unto itself. It has its own language, its own constraints, its own pathologies, its own hard-won heuristics about what actually works. A logistics company does not merely “move goods.” It navigates a labyrinth of carrier relationships, seasonal demand curves, regulatory regimes, and institutional memory about which routes fail in February and which warehouses cannot be trusted with fragile inventory. A healthcare system does not merely “treat patients.” It orchestrates a thousand overlapping workflows shaped by compliance requirements, payer relationships, physician preferences, and decades of accumulated clinical intuition.

This knowledge does not live in databases. Much of it does not live in documents. It lives in the heads of people who have been doing the work for years—people who often cannot fully articulate what they know because their expertise has become reflexive. It is tacit knowledge in the deepest sense, and it is precisely the knowledge that AI systems need in order to be useful.

No API call retrieves it. No fine-tuning procedure extracts it. No amount of prompt engineering conjures it into existence. It must be learned the way all deep knowledge is learned: through sustained proximity to the work itself.

Why Consulting Fails

The traditional model for bringing technology to complex institutions is consulting. A team arrives, conducts interviews, produces a document, builds something that approximates the requirements as understood at the time of the interviews, and then leaves. The institution is handed a system built on a snapshot of knowledge that was already incomplete when it was captured and that begins to decay the moment the consultants walk out the door.

This is not a criticism of intelligence or effort. It is a structural observation. The consulting model is architecturally incapable of capturing the kind of knowledge that makes AI systems work, because that knowledge is not transferable through interviews and requirements documents. It is only transferable through the act of building together, iterating together, failing together, and learning together over a sustained period.

The result is an industry littered with expensive AI initiatives that looked impressive in the boardroom and collapsed on contact with operational reality.

The Forward Deployed Model

A forward deployed engineer does not visit. They embed. They join the Slack channels and the standups and the incident calls. They read the codebase not to audit it but to understand it—to understand why this service was split off three years ago, why that database is denormalized in a way that seems irrational until you learn about the migration that failed in 2019, why the team refuses to touch the billing module and what would have to be true for them to trust a change to it.

This is not overhead. This is the work. The accumulation of institutional context is not a preliminary step before the real engineering begins. It is the engineering. Every week of embedded presence generates compounding returns, because each piece of institutional knowledge absorbed makes every subsequent technical decision more precise, more appropriate, more likely to survive contact with the real operating environment.

When an FDE builds an AI system, that system reflects not only technical competence but operational wisdom. It knows which edge cases matter and which are theoretical. It handles the exceptions that actually occur, not the ones that appear in textbooks. It respects the workflows that exist for reasons no one wrote down but everyone understands.

AI Without Context Is a Liability

The current enthusiasm for AI has produced a dangerous asymmetry. The technology is powerful enough to act with consequence but not, on its own, wise enough to act with judgment. A language model can generate code, draft contracts, analyze financial data, and produce medical summaries. Whether any of those outputs are appropriate in a given institutional context is a question the model cannot answer, because appropriateness is a function of knowledge the model does not have.

This is not a temporary limitation that will be resolved by the next model generation. It is a structural feature of the relationship between general-purpose intelligence and domain-specific operation. GPT-5 will not know that your compliance team requires a specific approval chain for customer-facing changes. Claude will not know that your production deploys are frozen on the third Thursday of every month because of a regulatory reporting cycle. No model will know these things because they are not in the training data. They are in the institution.

The forward deployed engineer is the bridge. They translate institutional reality into system behavior. They are the mechanism by which AI stops being a generic capability and becomes an operational advantage.

What We Have Seen

The companies we work with are not merely adopting AI. They are becoming fundamentally different kinds of organizations. A team of four, paired with the right agent infrastructure and the right embedded engineering support, now ships what used to require twenty. Not because the AI is doing the thinking for them, but because the AI has been taught—through months of embedded collaboration—to operate within the contours of their specific reality.

Features that took quarters are delivered in weeks. Migrations that were deemed too risky to attempt are completed safely. Backlogs that had been growing for years are cleared. And critically, the capability compounds. Every system we build, every agent harness we deploy, every workflow we automate makes the next one faster, because the institutional knowledge has been encoded into infrastructure that persists after we leave.

This is what it means to imbue business knowledge into AI. Not through some abstract process of “knowledge management” or “digital transformation,” but through the patient, difficult, deeply human work of learning an institution from the inside and building systems that reflect what was learned.

The Conviction

We believe the organizations that will define the next decade are the ones that figure out how to make AI operational—not as a feature, not as a demo, but as core infrastructure that absorbs institutional complexity and converts it into sustained advantage. This cannot be done remotely. It cannot be done through vendor relationships. It cannot be done by purchasing software and hoping it adapts.

It can only be done by putting exceptional engineers inside the organization, giving them the time and trust to learn, and building systems that reflect the full depth of what they discover.

That is the forward deployed model. It is harder than selling software. It is less scalable than SaaS. It demands more of our engineers and more of our partners than any conventional engagement model. And it is, we believe, the only way to build AI systems that actually work.

We work with a small number of partners at a time.

If your organization is serious about making AI operational, we should talk.
