
TOGAF Without Pain: What Framework Adoption Actually Looks Like in Practice

There is a well-documented pattern in enterprise architecture that no one talks about directly: the TOGAF (The Open Group Architecture Framework) implementation that looks mature from the outside and functions as a PowerPoint collection from the inside.

NovoCircle’s The Long Arc: A Practitioner’s Guide to the Six Stages of EA (Enterprise Architecture) Evolution maps how enterprise architecture practices evolve through six distinct stages, and locates TOGAF adoption as a specific capability transition — not a destination. This post applies that framework to the gap between TOGAF-on-paper and TOGAF-operational.

The Architecture Review Board exists. It meets quarterly. The Architecture Development Method is described in a governance document. The team has completed TOGAF certifications. The framework has been formally adopted. And yet the repository cannot be trusted for reporting. Governance submissions routinely arrive incomplete. The standards defined in the framework documentation are interpreted differently by different architects. Every quarter, the team spends days manually reconciling data before a report can go out.

TOGAF was adopted. TOGAF was not made operational. These are different things, and the gap between them is where most TOGAF implementations actually live.

What TOGAF Is and Isn’t

TOGAF (The Open Group Architecture Framework) is one of the most widely used enterprise architecture frameworks in the world. It provides a comprehensive methodology for developing, managing, and governing enterprise architectures — including a development method (the ADM), a content framework, a reference library, and a governance framework.

What TOGAF is not is a substitute for the foundational work that makes a framework operational. TOGAF describes what architecture governance should look like. It does not automatically produce how your specific organization will implement it. The distance between those two things is where the practical work lives.

Most TOGAF failures are not failures of the framework. They are failures to build the scaffolding that makes the framework function. The framework exists on paper. The scaffolding — the modeling standards, the consistent application of element types, the governance processes with teeth, the repository quality that makes the framework’s outputs trustworthy — exists partially or not at all.

The Stage 4 Paradox

Understanding why TOGAF often doesn’t deliver what it promises requires understanding where it fits in the architecture maturity arc.

TOGAF, properly implemented, is a Stage 4 to Stage 5 tool. It gives organizations the common vocabulary (Stage 4) and the governance processes (Stage 5) that allow architecture work to become consistent and trustworthy across the enterprise. For organizations that have built Stage 3 capability — a connected repository with elements that have identity and persistent relationships — TOGAF provides the structural scaffold that makes Stage 3 extend into Stage 4 and Stage 5.

For organizations that are still at Stage 2 — where architecture is captured in presentation tools rather than repositories — TOGAF adoption does not move them to Stage 5. It produces TOGAF documentation sitting on top of a Stage 2 practice. The governance processes are defined. The models underneath them are not governed. The framework is real. The maturity it describes is not.

This is the single most common misdiagnosis in EA practice development. An organization adopts TOGAF and calls itself Stage 5. The repository tells a different story.

What Actually Makes TOGAF Operational

TOGAF becomes operational when six things are in place simultaneously. Missing even one or two of them produces a practice that looks more mature than it is.

1. A repository, not a diagram collection. TOGAF’s content framework describes elements, relationships, and artifacts. Those things need to live in a modeling environment where elements have identity and relationships are persistent, not in presentation files where a box is a shape. This is the Stage 3 prerequisite that TOGAF requires but does not create.
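The difference between a shape on a slide and an element in a repository can be made concrete. In the sketch below (the `Element` and `Relationship` names are illustrative, not any platform's actual model), a relationship references an element's stable identity rather than its label, so renaming the element does not orphan anything:

```python
from dataclasses import dataclass

# Illustrative sketch only: Element and Relationship are hypothetical
# names, not TOGAF content-metamodel classes or any vendor's API.

@dataclass(frozen=True)
class Element:
    id: str    # stable identity, survives renames and re-diagramming
    type: str  # e.g. "Application", "Capability"
    name: str  # display label; free to change

@dataclass(frozen=True)
class Relationship:
    source_id: str  # refers to Element.id, never to a drawn shape
    target_id: str
    kind: str       # e.g. "realizes"

billing = Element("app-001", "Application", "Billing v1")
invoicing = Element("cap-010", "Capability", "Invoice Management")
rel = Relationship("app-001", "cap-010", "realizes")

# Renaming the application does not break the relationship, because the
# relationship holds an identity, not a label.
renamed = Element(billing.id, billing.type, "Billing v2")
assert rel.source_id == renamed.id
```

In a presentation file there is no equivalent of `Element.id`: a box renamed on one slide has no connection to the same box on another slide, which is exactly why diagram collections cannot be queried or reported against.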

2. Modeling standards with implementation teeth. TOGAF’s content metamodel defines what types of elements and relationships exist. Your organization’s modeling standards define how those types are applied in your specific environment — what “capability” means in your context, what the correct top-level taxonomy for your application portfolio is, what relationship types are used and what they mean. Standards documented in a governance document but not enforced in the repository are decorative. Standards built into the repository’s configuration — element type libraries, required properties, relationship constraints — have implementation teeth.
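What "implementation teeth" means can be shown as a save-time check. This is a hypothetical sketch, assuming a repository whose write path rejects non-conforming content; the rule tables and function names are illustrative, not any EA platform's real configuration format:

```python
# Hypothetical standards configuration: required properties per element
# type, plus an allow-list of relationship triples. Names are assumptions.
STANDARDS = {
    "Application": {"required": {"owner", "lifecycle_status"}},
    "Capability":  {"required": {"level", "parent"}},
}
ALLOWED_RELATIONSHIPS = {("Application", "realizes", "Capability")}

def validate_element(el_type: str, properties: dict) -> list[str]:
    """Return violations; an empty list means the element conforms."""
    required = STANDARDS.get(el_type, {}).get("required", set())
    missing = required - properties.keys()
    return [f"missing required property: {p}" for p in sorted(missing)]

def validate_relationship(src_type: str, kind: str, dst_type: str) -> list[str]:
    """Reject relationship types the standards do not define."""
    if (src_type, kind, dst_type) not in ALLOWED_RELATIONSHIPS:
        return [f"relationship not permitted: {src_type} -{kind}-> {dst_type}"]
    return []

# A save hook would refuse the write whenever violations is non-empty:
violations = validate_element("Application", {"owner": "finance-it"})
# violations == ["missing required property: lifecycle_status"]
```

The design point is where the check runs: a rule that executes on every save is a standard; the same rule written only in a governance document is a suggestion.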

3. An ADM that matches your actual architecture program. The TOGAF ADM is a prescriptive development method. It is also a framework that most organizations adapt rather than implement verbatim, because the full ADM is designed for large-scale enterprise architecture programs and most organizations are running something smaller, faster, or more targeted. The adaptation has to be conscious and documented. Organizations that try to implement TOGAF-as-written often find that the overhead of the method exceeds the capacity of the team. Organizations that adapt it without documenting the adaptation end up with inconsistent practice that each architect implements differently.

4. A governance process with defined authority. TOGAF’s governance framework describes how architectural decisions should be made, reviewed, and enforced. Making this operational requires defining who has decision authority for which categories of decision, what the escalation path looks like when a decision is contested, and what happens when a project proceeds without following the governance process. Governance without authority is not governance — it is advice.

5. Consistent reference framework application. TOGAF recommends alignment with industry reference architectures (BIAN for banking, TM Forum for telecommunications, DoDAF for defense, and others). Where these are relevant to your organization, they provide starting points for element taxonomies and capability maps that accelerate Stage 4 and Stage 5 work. Inconsistent application — where one business unit uses a reference framework and another doesn’t — produces exactly the non-mergeable models that signal Stage 4 is incomplete.

6. Metrics that reflect model quality, not activity. Most EA governance processes measure activity: how many reviews were held, how many models were submitted, how many certifications were completed. None of these measures model quality or repository trustworthiness. A team that processes fifty governance submissions per quarter and has a repository full of incomplete, inconsistently modeled elements has high activity and low Stage 5 readiness. Building metrics that track repository completeness, standards compliance rates, and data trustworthiness is essential to knowing whether TOGAF is operational rather than performed.
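The quality metrics described above can be computed directly from a repository export. A minimal sketch, assuming a flat export of element records; the field names (`owner`, `lifecycle_status`, `standards_compliant`) are illustrative assumptions, not a standard schema:

```python
def quality_metrics(elements: list[dict]) -> dict:
    """Track model quality, not review activity: what fraction of
    elements are complete, and what fraction pass the standards check."""
    total = len(elements)
    complete = sum(
        1 for e in elements
        if all(e.get(f) for f in ("owner", "lifecycle_status"))
    )
    compliant = sum(1 for e in elements if e.get("standards_compliant"))
    return {
        "completeness_rate": complete / total if total else 0.0,
        "compliance_rate": compliant / total if total else 0.0,
    }

# Illustrative export: two of four elements are fully populated,
# three of four pass the standards check.
export = [
    {"name": "Billing",   "owner": "finance-it", "lifecycle_status": "active", "standards_compliant": True},
    {"name": "CRM",       "owner": "",           "lifecycle_status": "active", "standards_compliant": False},
    {"name": "HR Portal", "owner": "people-ops", "lifecycle_status": "",       "standards_compliant": True},
    {"name": "Data Lake", "owner": "platform",   "lifecycle_status": "active", "standards_compliant": True},
]
m = quality_metrics(export)
# m["completeness_rate"] == 0.5, m["compliance_rate"] == 0.75
```

A dashboard built on numbers like these answers the Stage 5 question directly: not "how many submissions did we process," but "can the repository be trusted."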

The Common Failure Mode to Avoid

The most expensive TOGAF implementation failure is also the most preventable. An organization purchases an enterprise-grade EA platform, spends six months configuring it according to TOGAF’s content metamodel, trains the team on the platform, and then migrates all existing diagrams from the presentation tools they were using before.

The migration takes another six months. At the end of it, the organization has an enterprise-grade platform containing twelve hundred PowerPoint diagrams that have been recreated as elements in the new system. The elements have identity, but they were created hastily, with inconsistent naming, and without reference to the modeling standards that were documented but not yet embedded in the platform’s configuration. The repository is not more trustworthy than the diagrams were. It is just harder to update.

The lesson: the platform configuration and the modeling standards have to come before the migration. The governance process has to be in place before the submissions start. The sequencing matters more than the tool choice.

A Framework Conversation Worth Having

None of this is an argument against TOGAF. It is an argument for being clear-eyed about what TOGAF requires in order to deliver what it promises.

The organizations that get the most value from TOGAF are the ones that go into adoption understanding that the framework is a scaffold, not a foundation. The foundation has to be built first: a connected repository with elements that have identity, modeling standards that are enforced rather than documented, and a governance process that has organizational authority behind it.

When those foundations exist, TOGAF provides exactly the structural coherence it is designed to provide. The ADM gives teams a consistent development approach. The content metamodel gives the repository a consistent structure. The governance framework gives decisions a consistent authority. The reference frameworks give taxonomy decisions a starting point that doesn’t require reinvention.

If your organization is in the process of adopting TOGAF — or has adopted it and is finding that the expected outcomes aren’t materializing — book a discovery call. The gap between TOGAF-on-paper and TOGAF-operational is well-mapped. Getting from one to the other requires specific work in a specific sequence.

Read the full stage-by-stage framework in The Long Arc: A Practitioner’s Guide to the Six Stages of EA Evolution.

Ryan Schmierer is the founder and Sr. Managing Partner of NovoCircle, a technology advisory practice specializing in Modern Enterprise Architecture and Intelligent Automation. He brings 25+ years of enterprise technology experience at Cisco, Microsoft, and Sparx Services.
