Policy Bytes

The New Iron Triangle

The Iron Triangle of Painful Tradeoffs

For decades, defense acquisition has operated under a simple constraint known as the iron triangle. Any program can optimize for at most two of three variables: speed, cost, and capability. Choosing all three elements is impossible.

[Figure: The Iron Triangle, with Cost, Speed, and Capability at its vertices]

The framework originates in project management, where scope, time, and cost compete for priority. The tension among these elements cannot be resolved, only traded off, and it forces decision makers to choose. The Pentagon's biggest programs have lived, and even died, by that tradeoff.

A tradeoff has to be made, and prioritizing the wrong element can leave the whole program worse off. The F-35 Joint Strike Fighter is a canonical example, with the program prioritizing capability and speed over cost, and arguably failing at all three.

The Department of War paid contractors hundreds of millions in incentives to accelerate deliveries, while the cost variable absorbed the damage. But even with the sacrifice of taxpayer dollars, the tradeoff didn’t hold cleanly. By 2024, aircraft deliveries were late by an average of 238 days, and as of 2025, the DoW was still planning to ramp up production despite contractors’ inability to keep pace.

The Littoral Combat Ship attempted a different bet, choosing speed and cost as the primary goals of the program. The Navy bypassed traditional acquisition, basing concepts on commercial ship designs and postponing initial experimentation periods. The result was cost overruns, schedule delays, and ships that couldn’t perform their core function. The Navy began retiring LCS vessels after less than a decade of service.

The iron triangle isn't just a planning framework; it is a warning. The Pentagon has spent decades trying to do the impossible and get its entire wish list. Now, AI is changing the terms of the tradeoff entirely.

The AI Iron Triangle and the Choice of Reliability

AI changes the variables of the triangle.

Traditional procurement tradeoffs were shaped by the enormous per-unit costs of hardware. In 2025, for example, 191 F-35s were delivered; at over $100 million per aircraft before lifecycle expenses, the US government spent more than $19 billion on that year's deliveries alone. AI as a service is far cheaper. The Pentagon's current contracts with frontier AI labs amount to nearly $200 million each for a year of service, and that buys access to models that can be deployed across thousands of use cases.
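The cost comparison above can be made concrete with back-of-the-envelope arithmetic. The figures below are rough approximations drawn from the numbers cited in this piece (per-unit price, delivery count, contract value); the use-case count is an illustrative assumption, not a budget line:

```python
# Back-of-the-envelope comparison of hardware vs. AI-as-a-service costs.
# All figures are rough approximations for illustration only.

F35_UNIT_COST = 100e6       # over $100M per aircraft, before lifecycle expenses
F35_DELIVERED_2025 = 191    # aircraft delivered in 2025

AI_CONTRACT_COST = 200e6    # roughly $200M per frontier-lab contract, per year
AI_USE_CASES = 1000         # assumed: one contract covers thousands of use cases

hardware_total = F35_UNIT_COST * F35_DELIVERED_2025
print(f"F-35 deliveries, one year: ${hardware_total / 1e9:.1f}B")  # $19.1B

per_use_case = AI_CONTRACT_COST / AI_USE_CASES
print(f"AI contract, per use case: ${per_use_case / 1e6:.1f}M")    # $0.2M
```

Even under conservative assumptions about how many use cases a single contract actually serves, the marginal cost of an AI deployment sits orders of magnitude below the marginal cost of an airframe.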

Cost isn't the binding constraint anymore. Instead, AI introduces a new variable: reliability. This is the testing, validation, and assurance that these systems behave as intended — not just most of the time, but nearly all of the time — in the high-stakes environments inherent to any military deployment of these technologies.

The iron triangle becomes speed, capability, and reliability. And the Pentagon is choosing speed and capability.

[Figure: The New AI Iron Triangle, with Speed and Capability chosen and Reliability sacrificed]

This is not a choice the Department is hiding.

The DoW’s 2023 AI adoption strategy emphasized speed of deployment as a top priority, with a few references to responsibility. The Replicator initiative went further, explicitly framing the challenge as fielding autonomous and AI-enabled systems quickly to counter China’s military AI growth.

In January, Secretary of War Pete Hegseth released a memo cementing an AI-first posture of "wartime speed," declaring that the Pentagon needed to "accelerate like hell." And when Anthropic pushed back on changes to its Pentagon contract that moved away from certain guardrails, the DoW made clear it wanted a full suite of capabilities above all other considerations.

Reliability is the variable being sacrificed. The question is whether the Pentagon will recognize the cost before it pays it.

The instinct to frame reliability as the enemy of speed is wrong; it misunderstands what military AI systems actually require. An AI targeting system that is fast and technically sophisticated but produces a high rate of false positives isn't capable. It's a liability for everyone involved.
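The false-positive problem is worse than it sounds because of base rates. A minimal sketch with hypothetical numbers: even a system that catches 95% of real targets and wrongly flags only 5% of non-targets will, in an environment where real targets are rare, produce alerts that are overwhelmingly false alarms:

```python
# Base-rate arithmetic showing why a high false-positive rate undermines
# capability. All rates below are hypothetical, chosen for illustration.

true_positive_rate = 0.95   # detects 95% of real targets
false_positive_rate = 0.05  # wrongly flags 5% of non-targets
target_prevalence = 0.001   # assume 1 in 1,000 observed objects is a target

# Of everything the system flags, what fraction is actually a real target?
flagged_real = true_positive_rate * target_prevalence
flagged_false = false_positive_rate * (1 - target_prevalence)
precision = flagged_real / (flagged_real + flagged_false)

print(f"Fraction of alerts that are real targets: {precision:.1%}")  # ~1.9%
```

Under these assumptions, roughly 98 of every 100 alerts are false alarms. Operators drowning in false alarms either slow down to verify everything (erasing the speed advantage) or over-trust the system (inviting fratricide).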

The Director of Operational Test and Evaluation's FY2023 annual report raised concerns about the adequacy of testing and evaluation processes for increasingly complex systems, including AI and autonomous platforms, and outlined mitigation strategies. The Government Accountability Office found that the DoW lacks data that are usable for AI, and given that armed conflicts are characterized by uncertainty, volatility, and deception, representative data are incredibly difficult to find. These factors make military AI applications more brittle and prone to unpredictable behavior.

AI accidents aren’t a theoretical risk. The Patriot missile fratricide incidents showed how over-trust in automation produces catastrophic outcomes, coupled with operators lacking the tools and training to intervene.

Choosing Reliability and Speed: The Path to Sustainable AI in the Military

It is clear that reliability is what the Pentagon needs and that speed is what the Pentagon wants. This pairing could create a virtuous cycle that leads to better capabilities.

Robust testing and assurance processes, when built into the development pipeline rather than bolted on afterward, catch problems early, reduce costly rework, and build the operator trust that’s necessary for actual fielding. The Air Force’s Kessel Run software factory is a proof of concept. By combining developers, security operations, and IT operations into a single team, Kessel Run delivered war-planning software to users within months. The speedy delivery was not despite integrating security, but because of it. Reliability wasn’t a bottleneck; continuous delivery made reliability and security an accelerant.

A standardized procurement certification process would ensure AI systems meet reliability, security, and assurance benchmarks before operational deployment by the DoW.
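What might such a certification gate look like in practice? A minimal sketch, assuming a small set of benchmark metrics and thresholds; the metric names, threshold values, and `certify` function here are hypothetical illustrations, not any existing DoW standard:

```python
# Hypothetical pre-deployment certification gate. Metric names and
# thresholds are illustrative assumptions, not an actual DoW benchmark.

CERTIFICATION_THRESHOLDS = {
    "false_positive_rate": 0.01,          # must be at or below
    "adversarial_robustness_score": 0.90, # must be at or above
    "operator_override_latency_s": 2.0,   # must be at or below
}

def certify(results: dict) -> tuple[bool, list[str]]:
    """Return (certified, failed_metrics) for a vendor's reported test results."""
    failures = []
    if results["false_positive_rate"] > CERTIFICATION_THRESHOLDS["false_positive_rate"]:
        failures.append("false_positive_rate")
    if results["adversarial_robustness_score"] < CERTIFICATION_THRESHOLDS["adversarial_robustness_score"]:
        failures.append("adversarial_robustness_score")
    if results["operator_override_latency_s"] > CERTIFICATION_THRESHOLDS["operator_override_latency_s"]:
        failures.append("operator_override_latency_s")
    return (not failures, failures)

ok, failed = certify({
    "false_positive_rate": 0.02,          # too high: fails this metric
    "adversarial_robustness_score": 0.95,
    "operator_override_latency_s": 1.5,
})
print(ok, failed)  # False ['false_positive_rate']
```

The point of the sketch is not the specific numbers but the structure: a published, pass/fail benchmark gives vendors a fixed target and gives the Department an auditable record of what was tested before fielding.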

Procurement's dual function as a capability enabler and as an implementer of policy requires deliberate design. But the federal government already has a model for this kind of regime: the FAA's airworthiness certification process shows that reliability mandates and innovation can coexist. A vendor certification of AI reliability testing would apply the same logic to military AI.

Congress has already signaled interest. The FY2024 and FY2026 National Defense Authorization Acts included provisions on AI testing, responsible AI principles, and reliability frameworks. The natural next step is legislation that mandates certification, not just encourages it.

Congress must act to ensure that the Pentagon’s AI adoption does not repeat the worst patterns of traditional procurement, where speed and ambition outpaced accountability and resulted in overpriced, unsuitable programs.

AI reliability certification is not an obstacle to capability or to maintaining a military at the forefront of AI. These certifications ensure that capabilities materialize on the battlefield when warfighters and commanders need them most.

Bibliography

Albon, Courtney. “Pentagon Taps Four Commercial Tech Firms to Expand Military Use of AI.” Defense News, July 15, 2025. Link

Atherton, Kelsey. “Understanding the Errors Introduced by Military AI Applications.” Brookings Institution, November 28, 2022. Link

Beachkofski, Col. Brian. “Making the Kessel Run.” Air & Space Forces Magazine, 2019. Link

Callahan, Robert. “The Practical Role of ‘Test and Evaluation’ in Military AI.” Lawfare, October 7, 2025. Link

Center for Arms Control and Non-Proliferation. “F-35 Joint Strike Fighter: Costs and Challenges.” Accessed 2026. Link

Corrigan, Jack, Owen J. Daniels, Lauren Kahn, and Danny Hague. “Governing AI with Existing Authorities.” Center for Security and Emerging Technology, Georgetown University, July 2024. Link

Department of Defense. Data, Analytics, and Artificial Intelligence Adoption Strategy. November 2, 2023. Link

Director, Operational Test and Evaluation. FY2023 Annual Report. 2023. Link

Government Accountability Office. Artificial Intelligence: Status of Developing and Acquiring Capabilities for Weapon Systems. GAO-22-104765. February 2022. Link

Government Accountability Office. F-35 Joint Strike Fighter: Actions Needed to Address Late Deliveries and Improve Future Development. GAO-25-107632. September 3, 2025. Link

Government Accountability Office. Littoral Combat Ship: Actions Needed to Address Significant Operational Challenges and Manage Risks. GAO-22-105387. 2022. Link

Goussac, Netta, and Vincent Boulanin. Responsible Procurement of Military Artificial Intelligence. Stockholm International Peace Research Institute (SIPRI), February 2026. Link

Hadley, Greg. “F-35 Deliveries Soared to New Record in 2025.” Air & Space Forces Magazine, January 8, 2026. Link

Halstead, Kateryna. “Moving Beyond Pilot Programs to Codify and Expand Continuous AI Benchmarking in Testing and Evaluation.” Federation of American Scientists, June 11, 2025. Link

Heatherly, Christopher J. “The Iron Triangle: Technology, Strategy, Ethics, and the Future of Killing Machines.” War Room, U.S. Army War College, May 11, 2019. Link

Hegseth, Pete. “Artificial Intelligence Strategy for the Department of War.” Memorandum. Department of Defense, January 12, 2026. Link

Hicks, Kathleen H. Defense Strategy and the Iron Triangle of Painful Trade-offs. Center for Strategic and International Studies (CSIS), June 22, 2017. Link

International Committee of the Red Cross (ICRC). “Artificial Intelligence in Military Decision-Making: Supporting Humans, Not Replacing Them.” Law and Policy (blog), August 29, 2024. Link

Jensen, Benjamin, and Yasir Atalan. “The Pentagon’s AI Problem Isn’t Algorithms, It’s Evaluation.” Center for Strategic and International Studies, December 19, 2025. Link

“Lockheed Martin F-35 Delivery Delays.” Aerospace Global News, September 5, 2025. Link

Perlo, Jared. “Anthropic Sues Trump Administration in AI Dispute with Pentagon.” NBC News, March 9, 2026. Link

Pincus, Walter. “Pentagon May Be Slow to ‘Paradigm Shift’ in AI-Driven Weapons.” The Cipher Brief, September 24, 2024. Link

Rudner, Tim G. J., and Helen Toner. “Key Concepts in AI Safety: Robustness and Adversarial Examples.” Center for Security and Emerging Technology, Georgetown University, March 2021. Link

Sapien, Joaquin. "How the Navy Spent Billions on the 'Little Crappy Ship.'" ProPublica, July 10, 2019. Link

U.S. Congress. National Defense Authorization Act for Fiscal Year 2024. H.R. 2670, 118th Congress. Link

U.S. Congress. National Defense Authorization Act for Fiscal Year 2026. S. 1071, 119th Congress. Link

Vincent, Brandi. “‘Accelerate Like Hell’: Hegseth Moves to Reshape DOD’s AI and Tech Hubs.” DefenseScoop, January 13, 2026. Link

Vincent, Brandi. “Hicks Shares New Details on DoD’s Vision for Replicator Autonomous Systems, but Questions Linger.” DefenseScoop, September 6, 2023. Link