Review: Pilot Competency and Capability
Review of Pilot Competency and Capability: Responsibilities, Strategy and Command by Steven D. Green
Some disclosures before we begin. First, I know the author. Second, this book sits squarely in my professional domain. Third, the thesis aligns so closely with what this Substack has been exploring that my confirmation bias is probably doing a victory lap. I tell you this not to undermine what follows, but because intellectual honesty requires it, and because the book itself would demand nothing less.
Steven D. Green’s Pilot Competency and Capability is an academic text that takes on one of aviation’s deepest philosophical tensions: the gap between what we say the pilot is — the final authority, the capable and competent aviator — and what the industry has been systematically building the pilot into: a systems operator within a cybernetic control loop.
Green’s argument is built on margins. Not just the margins pilots think about every day, like landing distances, fuel reserves, and obstacle clearance, but also the deeper margins that make safe operations possible at all: the margin of residual attention, the margin between work-as-imagined and work-as-done, the margin between what can be specified in a procedure and what actually keeps an airplane safely on a trajectory through an open, complex, tempestuous environment.
Green argues these margins exist to absorb the unforeseen. They are not there for operational optimization to find more things to shave time or cost from. When we shrink them, whether to save fuel, to increase throughput, or to squeeze a few more seconds out of a turnaround, we are spending safety capital that was never ours to spend.
It is, to me anyway, very easy to see how these concepts generalize to our society today, except that society at large has none of the defenses against that encroachment that the aviation industry has built over decades of literal blood, sweat, and tears.
This is not a new argument in safety literature, but Green grounds it with unusual depth. He traces the concept of prudence to its Latin root, providentia — foresight — and builds a framework around St. Thomas Aquinas’s structure of practical reason: take counsel, judge what you have learned, then execute command. Command, in Green’s telling, is the action that follows judgment. It is what people do with their free will. And it is precisely the thing that gets compressed when we proceduralize everything, because the cybernetic approach to automation treats standard operating procedures as lines of code and the human being as a machine executing them.
One could say I’m a fan of cybernetics, or at the very least drawn to some of its concepts, so when Green took aim at it, I was on guard. For the cybernetic approach to work, Green argues, the operating environment must be a closed system.
Aviation is no such thing. One could argue very few things operating out there in the real world are. Life is no such thing.
The cybernetic approach to automation, he writes, is a digitization of Herbert Simon’s 1947 premise that two people with the same information will rationally decide upon the same course of action — a premise Green identifies as a definitive statement of determinism, invested in what Sidney Dekker calls naïve Newtonian scientism. The belief that if we can just specify enough variables, control enough inputs, and standardize enough outputs, we will achieve safety.
We won’t.
And Green’s own experience gives this weight. Early in the book, he recounts a 1978 crash in a New Hampshire forest that killed his best friend and very nearly killed him. Over forty years later, he still doesn’t know what caused the airplane to sink into the trees on a calm summer evening.
The hidden axiom of the capable-and-competent pilot is the unstated assumption that the pilot will always see the threat coming.
Tell that to Richard de Crespigny of QF32 or Kevin Sullivan of QF72.
The accident you’ll actually have is the one you won’t see coming, and the capability to manage that uncertainty is exactly what gets eroded when we replace judgment with procedure. Those who see automation as the path to safety should note that the tail of things that can and does go wrong is far longer than any system designer can specify.
Readers of this Substack will recognise the through-line. What Green describes as the margin of residual attention (the capacity that must be protected so the pilot can identify stuck programs, spot unruly trajectories, and shoot them down) is the operational face of what James C. Scott calls métis: practical knowledge that resists codification because the environments in which it operates are too variable for formal procedures. Green’s entire framework of prudence, command, and the ecology of action maps onto Scott’s argument about what gets destroyed when systems demand legibility above all else.
And it connects directly to what I called the grace margin here — the space between what a system prescribes and what a person actually does. Green’s observation that controlling and anticipating workers’ compliance naturally generates pushback from people who desire to retain an identity of craftsmanship is the aviation-specific version of a universal truth: when you proceduralize everything, you don’t get compliance. You get shortcuts, workarounds, and optimizing violations, because the people doing the work understand something the system designers don’t.
The book is dense. It is academic in tone and structure, rich with references to Amalberti, Morin, Weick, Reason, Dekker, and Hollnagel. It covers a lot of territory: ampliative reasoning, self-organized criticality, fat-tailed probability distributions, cognitive dissonance theory, some or all of which will be unfamiliar to many readers. Green doesn’t condescend or explain to the lowest common denominator. He doesn’t simplify where simplification would distort. This is not a book for a casual audience, and it doesn’t pretend to be.
It is also, inevitably, a book rooted in American aviation regulation. The FAR references, the NTSB case studies, the FAA-centric regulatory framework are appropriate to the subject but will require some translation for readers operating under EASA, CASA, or other regulatory regimes. The underlying arguments transcend jurisdiction, but the direct examples don’t always.
Amidst all that complexity, it can be easy to lose some of the key messages. Thankfully, they are also driven home in plainer language, such as this:
“We are not systems managers; we are managers of uncertainty.”
If you read one sentence from this review, make it that one.
Green closes by returning to Saint-Exupéry’s elemental divinities (the mountain, the sea, the wind) and arguing that automation, like the sky itself, is an open system. We can, at best, debate with it on terms of equality. We must never assume superiority to it. And we must vest the pilot with the authority and the capability to manage that debate, because the responsibility is towering and moral, and it can reside nowhere else.
Rating: 4.5 out of 5
Dog-ear index: 12.3
Who is it for: Aviation professionals (especially pilots, instructors, safety specialists, and human factors researchers) who want the philosophical and theoretical underpinning for why proceduralization and automation alone will not be sufficient. Also for anyone working in safety-critical systems who suspects that the industry’s faith in procedures and technology has outpaced its understanding of what actually keeps complex operations safe. This is not a casual read. It will reward those who come to it with some background.
[reminder: I highlight important parts of the books I read, and dog-ear the really important pages. The dog-ear index is simply the average number of dog-eared pages per 100 pages]
The first chapter of the book is published as Open Access, so it is freely available online at https://library.oapen.org/bitstream/handle/20.500.12657/62009/1/9781003369677_10.1201_9781003369677-1.pdf. For the full book, please support your local bookstore where possible (it is unlikely to have it in stock, but should be able to order it for you); Amazon product link for reference: https://www.amazon.com.au/Pilot-Competency-Capability-Responsibilities-Strategy/dp/1032439742/


