
March 7, 2024

Accounting Versus Physics – Rewiring Organizations for Improvement 

By Summary by IT Revolution

In his insightful presentation at the recent DevOps Enterprise Summit, Scott Prugh, transformation technology leader, highlighted the critical differences between how we typically measure and manage software projects (using linear, accounting-based approaches) and how software projects and organizations actually behave (following more exponential, physics-based dynamics). 

Prugh contends that relying on accounting approaches such as estimates and utilization rates, rather than truly understanding concepts like coordination costs, coupling, and flow, is a recipe for slow delivery, unhappy employees, and frustrated customers.  

In this post, we’ll summarize Prugh’s key points and examples regarding the need to rewire the complex “physics” of software delivery for real improvement.

The Three Layers of Organizational Change

Early in his talk, Prugh introduced a framework for different layers that exist within organizations, based on the book Wiring the Winning Organization by Gene Kim and Steven J. Spear:

  1. The Technical Layer: This lowest layer represents the developers, testers, and architects who work hands-on with the code and systems on a day-to-day basis.
  2. The Tools Layer: This middle layer encompasses all of the supporting tools and infrastructure relied upon to deliver software, like version control systems, CI/CD pipelines, telemetry stacks, infrastructure as code, work tracking platforms, and more.
  3. The Social Layer: The highest layer relates to organizational policies, architectures, processes, information flows, cultural norms, and behaviors. As Kim and Spear put it in their book, you can think of this as the “social circuitry” of the business.

Prugh stressed that making lasting, impactful changes solely at one layer is unlikely to move the needle. Just focusing on the technical staff and making developers more efficient can only address a small fraction of the overall lead time equation. Changes must occur across all three layers to fundamentally improve outcomes.

For example, if 85% of the duration of a feature delivery process is tied up in handoffs, reviews, and delayed decisions from organizational dysfunction, even a 50% improvement in developer throughput will only trim a couple of weeks off an 18-month delivery cycle.  
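The arithmetic behind this claim is easy to check. A minimal sketch, using the illustrative figures above (85% wait, 50% throughput improvement, 18-month cycle); the calculation is a reconstruction, not Prugh's own worksheet:

```python
# Illustrative arithmetic for the example above: an 18-month delivery
# cycle where 85% of elapsed time is organizational wait (handoffs,
# reviews, delayed decisions) and only 15% is hands-on development.
total_months = 18.0
wait_fraction = 0.85
dev_months = total_months * (1 - wait_fraction)   # 2.7 months of hands-on work

# A 50% throughput improvement means the same work takes 1/1.5 the time.
improved_dev_months = dev_months / 1.5            # 1.8 months
new_total_months = total_months * wait_fraction + improved_dev_months

print(f"Old cycle: {total_months:.1f} months, new cycle: {new_total_months:.1f} months")
print(f"Saved roughly {(total_months - new_total_months) * 4.33:.0f} weeks")
```

Even a heroic doubling of developer throughput would reclaim only 1.35 of the 18 months; the 15.3 months of organizational wait are untouched by any amount of developer efficiency.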

Understanding the Root “Physics” 

Next, Prugh dove deeper into what he called “physics-based” formulas that actually govern how work flows (or doesn’t) through complex technology organizations.

He touched on three key theories or phenomena that clash with how management traditionally views software delivery through rose-colored accounting lenses:

  1. Wait Time Presenteeism: As described in The Phoenix Project, when faced with relentless demands and overburdened employees, organizations inadvertently increase wait times and queues as workers scramble and context switch to try and cope. This creates further drag on flow.
  2. Coordination Risk: Research into complex systems suggests that every additional dependency or handoff cuts the chance of on-time delivery in half. More coordination equals more delays.
  3. Half-Life of Knowledge: As projects get handed off between teams, critical implicit knowledge gets lost. With each cross-team handoff, transmitted knowledge decreases by 50% as context vanishes.
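The three dynamics above can be expressed as toy formulas. These are my formulations of the stated rules of thumb, not Prugh's exact math:

```python
# Toy formulations of the three "physics" rules of thumb above.

def wait_time_factor(utilization: float) -> float:
    """The Phoenix Project's rule of thumb: wait time ~ %busy / %idle.
    A resource at 90% utilization queues work 9x longer than one at 50%."""
    return utilization / (1.0 - utilization)

def on_time_probability(handoffs: int) -> float:
    """Coordination risk: each dependency/handoff halves on-time odds."""
    return 0.5 ** handoffs

def knowledge_retained(handoffs: int) -> float:
    """Half-life of knowledge: each cross-team handoff halves context."""
    return 0.5 ** handoffs

for h in (1, 3, 5):
    print(h, on_time_probability(h), knowledge_retained(h))
```

Five handoffs leave a 1-in-32 chance of on-time delivery and about 3% of the original context intact, which is why linear resource models so badly mispredict these curves.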

In Prugh’s view, these physics-based dynamics will swamp any attempts to manage software delivery with traditional linear projections, resource allocation models, and efficiency metrics.

He emphasized that while accounting has its place in budgeting and governance, physics dictates the actual speed and quality outcomes. And yesterday’s org structures struggle with exponential coordination friction and complexity.

Rewrite the Formula with Simplification  

How do you rewrite the physics formula to win? Prugh’s answer: Simplification, which is inspired by Kim and Spear’s work in Wiring the Winning Organization.

While such a concept sounds easy, implementing organizational and architectural simplification initiatives involves real work and deep change across those three layers we discussed earlier.

Some examples Prugh highlighted included:

  • Modularizing systems along business domains versus outdated technical boundaries to localize complexity and dependencies.
  • Moving toward platform teams versus fragmented specialized groups to promote self-service. 
  • Embedding capabilities within cross-functional product teams versus disjointed groups that force contentious handoffs.
  • Pursuing incremental delivery models versus large batch transfers and “big bang” integration headaches.  
  • Maximizing cohesion of functionality and minimizing areas of sprawl or contention.

All of these simplification moves help smooth flow, contain complexity, and speed up feedback loops, as long as they permeate policies, culture, architecture, and technical practices in a unified way.

An Example Transformation Story

To vividly demonstrate the potential impact of systems thinking and simplification, Prugh walked through a detailed example using before and after analysis of a real client scenario.

He described an initiative focused on one product portfolio suffering from lengthy delivery lead times and poor quality despite previous attempts to scale up capacity.

The starting state reflected many of the dysfunctions facing modern enterprises:

  • Hundreds of microservices scattered across unclear domains.
  • Disjointed specialized teams decoupled from actual customer value streams.
  • Islands of cloud infrastructure, platforms, and data stacks.
  • Brittle and manual testing and deployment approaches.
  • Tightly coupled architectural boundaries misaligned with business priorities.
  • Disconnected systems with high friction handoffs spanning domains and regions.

These intersecting issues led to slow flow, high degrees of waste, and coordination delays that manifested in multi-month queues and dismal reliability.

Despite leaders layering on more developers and test automation, lead times stretched ever longer. Legacy constraints mixed with accumulated technical debt prevented meaningful velocity gains.

Attacking this situation, Prugh helped the client re-visualize their delivery approach and restructure their value stream execution around simplification.  Some of the key changes included:

  • Consolidating platforms and stacks under a single cloud provider to reduce architectural contention.
  • Building a scalable event streaming backbone to connect systems and data.
  • Establishing multi-product teams organized around business-aligned domains versus old legacy systems. This reduced handoffs.
  • Co-locating related delivery capabilities within each stream-aligned team – planning, coding, testing, and release – to eliminate cross-team handoffs.
  • Transitioning deployments and testing to automated self-service pipelines.
  • Instituting a continuous delivery model aligned with incremental working states.

Though an arduous change, the collective impact fundamentally rewired the coordination and delivery physics plaguing this portfolio. Lead times shrank from months to weeks. Escaped defects dropped 60%. Employee engagement improved markedly.

Summarizing Prugh’s Formula For Change

In closing out his presentation, Prugh underscored that software development involves much more physics than accounting and linear thinking. Terms like coordination costs, coupling, cohesion, and flow matter immensely.

He emphasized to technology leaders that if you solely focus on developer productivity tools or velocity metrics, you will likely fail to move the needle on business outcomes.

Instead, take inspiration from modern architectures and delivery methods like stream processing, event sourcing, and continuous integration/continuous delivery that embrace simplification. 

Then drive systemic changes across your organizational behaviors (like Lean thinking), processes (aligning value streams from idea through to customer), and technical systems (loosely coupling boundaries around business capabilities).

This compounding effect creates positive momentum versus stagnation from fighting exponential complexity curves with linear models. The future belongs to physics. Are you ready to embrace it?


To watch the full presentation, please visit the IT Revolution Video Library.
