
Boeing's Organisational Structure: An OD and AI Transformation Case Study

Labyrinth Coaching & Consulting
April 2026

Introduction

Boeing's recent safety crises shocked the aviation world. But for those who study organisations, the warning signs were visible long before the headlines. What began as a story about engineering failures and cultural dysfunction has now entered a new chapter — one in which artificial intelligence is both exposing the depth of Boeing's organisational problems and raising urgent questions about how AI is reshaping the company's structure, workforce, and future.

This article examines Boeing through a comprehensive Organisational Development (OD) lens. We look at what went wrong, why it went wrong, and what the Boeing case tells us about the organisational challenges every leader faces as AI transforms how companies are built, run, and held accountable.

The Shift from Engineering to Finance Culture

Boeing's transformation from an engineering-led company to a finance-led one is well documented. The 1997 merger with McDonnell Douglas brought a new leadership culture, and the emphasis shifted from "if it ain't broke, don't fix it" to "if it doesn't make money, cut it" (Lazonick and Shin, 2020). This cultural shift had profound consequences for safety, quality, and employee morale.

The consequences were not abstract. Engineers who raised concerns were sidelined. Quality control processes were weakened in the name of efficiency. The 737 MAX disasters that followed were not the result of a single bad decision — they were the result of an organisational structure that had systematically devalued the people and processes that kept aircraft safe (US Senate Permanent Subcommittee on Investigations, 2024; Lazonick and Shin, 2020).

What is less often discussed is how this structural shift created the conditions for AI to be deployed badly. When an organisation's culture prioritises financial metrics over engineering rigour, AI tools get used to accelerate production rather than to improve safety. The technology becomes an amplifier of the existing culture — for better or worse.

Boeing's Organisational Structure: What Went Wrong

To understand Boeing's crisis fully, it helps to look at its organisational structure. Boeing operates a complex matrix structure, with business units (Commercial Airplanes; Defense, Space & Security; Global Services) sitting alongside functional departments. In theory, this structure enables coordination across a vast, global enterprise. In practice, it created accountability gaps that proved fatal.

Several structural failures stand out:

  • Dispersed decision-making without clear accountability. When the 737 MAX's Manoeuvring Characteristics Augmentation System (MCAS) was being developed, responsibility was distributed across multiple teams and suppliers. No single person or team had clear ownership of the system's safety implications (US Senate Permanent Subcommittee on Investigations, 2024).
  • Outsourcing of critical functions. Boeing's decision to outsource significant portions of manufacturing and engineering — including to Spirit AeroSystems, which produced the fuselage involved in the 2024 Alaska Airlines door plug incident — created structural dependencies that were poorly managed (Shea and Allon, 2025).
  • Geographic fragmentation of leadership. The decision to move Boeing's headquarters from Seattle (where its engineers were) to Chicago, and later to Arlington, Virginia, widened the distance between senior leadership and the people doing the work (Mintz and Miller, 2025). This is a classic OD warning sign: when leaders lose proximity to operations, they lose the informal intelligence that keeps organisations honest.

Psychological Safety and the Silence of Engineers

One of the most damning findings from investigations into Boeing's 737 MAX disasters was that engineers had raised concerns — and been ignored or silenced. This is a classic failure of psychological safety: the belief that one can speak up without fear of punishment or marginalisation (Edmondson, 2018). When psychological safety breaks down, organisations lose their most valuable early warning system.

The testimony of Boeing whistleblowers — two of whom died in 2024 in circumstances that drew public scrutiny — painted a picture of a culture in which speaking truth to power carried serious personal risk (US Senate Permanent Subcommittee on Investigations, 2024). This is not an isolated Boeing problem. It is a systemic risk in any large organisation where financial pressure overrides professional integrity.

The AI dimension here is significant. As Boeing and other manufacturers deploy AI-powered quality control systems, the question of who can challenge an AI's output — and whether they feel safe doing so — becomes critical. If engineers were silenced when they raised concerns about human-designed systems, what happens when the concern is about an AI system that senior leaders have publicly championed?

AI and Boeing's Organisational Transformation

Boeing has invested heavily in AI and digital transformation in recent years. Its use of AI spans predictive maintenance, quality inspection, supply chain optimisation, and design simulation (McKinsey & Company, 2023). In principle, these are exactly the kinds of applications that should improve safety and efficiency. In practice, the results have been mixed — and the reasons why are instructive for any organisation embarking on AI transformation.

The core problem is that AI does not fix broken organisational structures. It amplifies them. Boeing's AI-powered quality inspection tools, for example, are only as good as the data they are trained on and the culture in which they operate. If the underlying culture discourages reporting defects, the AI will be trained on incomplete data. If the organisational structure does not give quality engineers the authority to act on AI-generated alerts, the technology becomes theatre rather than transformation.

There is also a workforce dimension. Boeing's AI investments have accelerated automation in manufacturing, raising difficult questions about workforce planning, skills development, and the role of experienced engineers in an increasingly automated production environment. The loss of institutional knowledge — the kind that lives in the heads of long-serving engineers, not in databases — is a structural risk that Boeing has not yet adequately addressed.

The OD Lessons for Every Leader

The Boeing case is one of the most comprehensive OD case studies available precisely because almost every dimension of organisational health was implicated in its crisis. Culture, structure, leadership, psychological safety, change management, workforce development — all of them failed, and all of them are now being tested again as Boeing attempts to rebuild under AI-era pressures.

For leaders navigating their own AI transformation journeys, the Boeing case offers several clear lessons:

  • Culture determines how AI is used. Before deploying AI, examine the culture into which it will land. An organisation that punishes honesty will use AI to enforce compliance, not to improve performance.
  • Structure must enable accountability. AI transformation requires clear ownership of decisions, including decisions made by or with AI systems. Matrix structures that diffuse accountability are a liability in an AI-enabled environment.
  • Psychological safety is a prerequisite for AI adoption. If people cannot safely challenge AI outputs, organisations will make worse decisions, not better ones. Building cultures where people feel safe to question — including questioning technology — is not a soft skill. It is a strategic imperative.
  • Proximity matters. Leaders who are distant from operations — whether physically or culturally — lose the ability to sense what is really happening. AI dashboards and data visualisations are not a substitute for genuine organisational intelligence.
  • Workforce transition is an OD challenge, not just an HR one. The displacement of experienced workers by AI systems is not simply a headcount question. It is a question of organisational memory, capability, and resilience.

What Boeing's Recovery Tells Us

Boeing's recovery — if it comes — will be an OD story as much as an engineering or financial one. The new leadership team faces the task of rebuilding trust with regulators, customers, employees, and the public simultaneously. They must restructure an organisation that has become too complex, too fragmented, and too disconnected from its engineering roots.

They must also navigate the AI dimension: deciding which AI investments to accelerate, which to pause, and how to ensure that AI tools serve the organisation's mission rather than its short-term financial targets.

For OD practitioners and leaders watching from the outside, Boeing is a live laboratory. The decisions being made now — about structure, culture, leadership, and technology — will determine whether Boeing becomes a cautionary tale or a genuine turnaround story.

Either way, the lessons are clear. Organisational health is not a luxury. In an AI-enabled world, it is the foundation on which everything else depends.

References

  1. Edmondson, A.C. (2018). The Fearless Organization: Creating Psychological Safety in the Workplace for Learning, Innovation, and Growth. Hoboken, NJ: Wiley.
  2. Edmondson, A.C. and Lei, Z. (2014). 'Psychological safety: The history, renaissance, and future of an interpersonal construct', Annual Review of Organizational Psychology and Organizational Behavior, 1(1), pp. 23–43. Available at: https://doi.org/10.1146/annurev-orgpsych-031413-091305
  3. Lazonick, W. and Shin, J.S. (2020). The Transformation of Boeing from Technological Leadership to Financial Engineering and Decline. Cambridge: Cambridge University Press.
  4. Mintz, S.M. and Miller, W.F. (2025). 'The story of Boeing's failed corporate culture: Putting profits ahead of safety', The CPA Journal, June 2025.
  5. Schein, E.H. (2010). Organizational Culture and Leadership. 4th edn. San Francisco: Jossey-Bass.
  6. Shea, G.P. and Allon, G. (2025). 'Boeing: wait, there's more', Strategy & Leadership, 53(2), pp. 160–166.
  7. US Senate Permanent Subcommittee on Investigations (2024). Boeing's Broken Safety Culture. Washington, DC: United States Senate.
  8. Zweifel, T.D. and Vyal, V. (2021). 'Crash: Boeing and the power of culture', Journal of Intercultural Management and Ethics, 4(1).
  9. McKinsey & Company (2023). The State of AI in 2023: Generative AI's Breakout Year.
  10. World Economic Forum (2025). The Future of Jobs Report 2025. Geneva: World Economic Forum.
