The Investigation Process Research Resource Site

Presented at the International Society of Air Safety Investigators
International Seminar, Barcelona, Spain; October 20, 1998

© Copyright, 1998 - Ira J. Rimson and Veridata, Ltd.
Reproduction in any form without the express written consent of the copyright holders is prohibited.
Posted at this site with permission

Investigating "Causes"
by Ira J. Rimson, P.E.
Veridata, Ltd.; P.O. Box 11008
Albuquerque, NM 87192-0008
Phone & FAX: 505-275-4970


If "probable cause" is OK for safety (and I doubt it), it certainly is unsatisfactory for prevention, and entirely unacceptable for investigations. Who wants to pay for an investigation whose result is "probable cause" except someone who doesn't want to know for sure anyhow?    -- Hughes Chicoine, CFEI


Determining accident "Causes" has been a source of ignition for arguments in the aviation community for half a century. Few today are familiar with the origins of "Probable Cause" determination in air safety investigation. Fewer still understand how this uncritically accepted objective has contaminated subsequent investigations and hindered Investigating Agencies' conjoint legislative purpose: preventing accident recurrence.

Civil aviation accident investigation in the U.S. began amid fractious political and economic environments that aspired more to reward political connection than to prevent losses. The bureaucrat-attorneys who enacted the conjoint objectives of causal determination and prevention for investigating authorities failed to establish measurable definitions against which investigations could be assessed. The resulting absence of investigative rigor has been accepted uncritically to this day. The simplicity of early aviation systems accommodated rudimentary investigative methods. The current aeronautical environment's rapidly expanding technological complexity does not. We can no longer afford to indulge in unscientific, unvalidated, unstandardized investigations, which produce unverifiable conclusions. Recommendations arising from these confusions of unsubstantiable data lack any reliable means to gauge their efficacy for preventing recurrence.

The air safety community must acknowledge what the rest of the world already recognizes: its concentration on assigning "Probable Causes" has failed to halt preventable mishaps. Historic concepts of causation have been ineffectual. An alternate definition of Cause --that of precursor (as in cause-and-effect) --affords the opportunity to create a "Calculus of Causation". The added rigor would enable establishing investigation models as benchmarks against which standard, logic-based investigations can be evaluated, verified and validated, and provide timely identification of potentially successful intervention strategies to deter repetition.

Historical Background

The Crash

At 3:30 a.m. on May 6, 1935, TWA Flight 6, a Douglas DC-2 enroute from Albuquerque, New Mexico, to Kansas City, crashed sixteen miles south of Kirksville, Missouri, with five fatalities among the thirteen persons on board. One of the fatally injured was Senator Bronson Cutting of New Mexico. The political furor aroused by his death, and subsequent actions taken by both government and industry parties and congressional intervenors, profoundly influenced the purportedly objective investigation process. The discord which followed the Cutting Air Crash permanently dashed any expectations that air safety investigation authorities could determine causes of accidents objectively without the self-serving intervention of implicated parties. (Ko:84)

The Bureau of Air Commerce

The Air Commerce Act of 1926[1] was the U.S. Congress's response to government's evolving role in regulating the U.S. air transportation industry. The Act charged the Secretary of Commerce with fostering air commerce, issuing and enforcing air traffic rules, licensing pilots, certificating aircraft for airworthiness, designating and establishing airways, and operating and maintaining aids to navigation. Of the seven original organizational units which made up the Aeronautics Branch of the Department of Commerce, only three had actually been part of it. Others were scattered throughout the Department; e.g., the Airways Division was assigned originally to the Bureau of Lighthouses, and the Aeronautics Research Division to the Bureau of Standards. Consolidation of the dispersed units into one organizational entity was not achieved until 1934, when the former Aeronautics Branch was renamed the Bureau of Air Commerce.

The Act also assigned to the Secretary of Commerce responsibilities for establishing and maintaining a "high level of safety", and to "investigate, record and make public the causes of accidents in civil air navigation." The concept of accident "causes" was thus initiated into the regulatory process by which those statutory functions were to be exercised by the government.

The Response

The death of a U.S. Senator in an air transport crash infuriated members of the U.S. Congress, especially because of perceived personal conflicts between Senator Cutting and then-President Franklin Roosevelt.

...the Cutting crash was seen by many people as a tragic consequence of a bankrupt aviation policy --a policy, it was held, that neglected Government's responsibilities in air safety in favor of economy and political preferment.[2]

Just one week after the Cutting crash, the U.S. Senate entertained a resolution which called for the Committee on Commerce, or a subcommittee thereof, "...to investigate fully and thoroughly..." the aircraft accident that resulted in the death of "an honored member of this body" and "...any other accidents or wrecks of airplanes engaged in interstate commerce."[3]

The Senate passed the resolution on June 7, 1935.

The Investigation

In the meantime, Eugene Vidal, Director of the Bureau of Air Commerce, appointed a five-member Board to investigate the crash. It questioned 59 witnesses and filled more than 900 pages of testimony, reporting back to Vidal on June 4, 1935, less than a month after the accident. The Board concluded that:

The probable direct cause of this accident was an unintentional collision with the ground while the airplane was being maneuvered at a very low altitude in fog and darkness.[4]

There is obviously little argument to be made with the "...probable direct cause..." determined by the Board, except that it is not a cause. It is a statement of what happened.

The Board's "contributory causes", on the other hand, were not so innocuous, including:

Failure of the United States Weather Bureau to predict the hazardous weather conditions that developed in the latter part of the flight.

Granting clearance to the aircraft in Albuquerque by TWA ground personnel in the face of knowledge that the aircraft's two-way radio was not functioning on western night frequency.

Errors in judgment on the part of the pilot in command in proceeding with the flight after discovering that he was unable to communicate effectively with the ground. Failure of TWA ground personnel at Kansas City to expeditiously redispatch the airplane to a field where better weather existed when it became apparent that the ceiling at Kansas City was dropping to and below the authorized minimum for landing and while the airplane had sufficient fuel to fly to an alternate airport and still meet the Department of Commerce reserve-fuel requirements.[5]

The Board found "...no evidence...that the established Department of Commerce navigation aids or its personnel were not functioning properly...." The Bureau subsequently found that the performance of TWA personnel constituted violations of air commerce regulations, and cited the airline and its employees for at least six violations for which the Bureau assessed civil penalties against TWA in the amount of $3500.[6]

TWA and the U.S. Senate Sub-committee

Both TWA's President Jack Frye and Senator Royal Copeland, who headed the sub-committee appointed to conduct the investigation initiated by Senate Resolution 146, counterattacked the findings of the Bureau's Investigation Board. Frye co-opted the Copeland Sub-committee's principal technical consultant, Colonel Harold E. Hartney, whose field inquiries soon uncovered numerous witnesses along the airplane's route of flight. That the witnesses' testimony turned out to be internally inconsistent, and inconsistent with both established facts and each other, apparently did not trouble these investigators. U.S. Post Office inspectors finally were called in to verify the more spurious claims of the Copeland Sub-committee staff. After demonstrating convincingly that the original Frye-Hartney hypothesis for the accident was totally unsubstantiable, the two dropped their initial theory in favor of another of similar implausibility.[7]

More significant for accident theorists was the testimony of Accident Board Members before the subsequent Senate hearings. Denis Mulligan, Chief of the Enforcement Division, testified that the accident "...started to happen long before the airplane reached the vicinity of Atlanta, Missouri." Likewise, R. W. Schroeder, Chief of the Air Line Inspection Service, testified that the accident started to happen when the pilot, still only a half hour out of Wichita enroute to Kansas City and knowing he was without two-way radio communication, first encountered instrument meteorological conditions. Yet both TWA's Captain and dispatcher sanctioned continuation into the weather and a fruitless attempt to land at Kansas City before attempting to continue on to a suitable alternate landing field.

"They both gambled on getting into Kansas City," Mulligan concluded, "and they lost."[8]

The Sub-committee's Findings

Public perceptions of the Bureau of Air Commerce were summed up in the following contemporaneous observations published during the conflict between the Bureau and TWA:

Aero Digest wrote: "The seriousness of this controversy cannot be overestimated. It goes right to the heart of the Government's system of regulating the aviation industry." And if TWA's claims were correct, "...they expose a hitherto hidden disease gnawing at the vitals of America's air transport, the disease of politics in its most hideous form --face saving."

The Air Line Pilots Association's President detected "...a tendency on the part of the present regulatory agency to theorize too much, and, in the absence of immediate concrete evidence, the accident is conveniently blamed on the pilot."[9]

A Copeland Sub-committee aide reviewed the Bureau's records of Probable Cause determinations in air carrier accidents and found that "...in no case during the past ten years has the Department ever accused themselves of an accident." He concluded, "Naturally, they [Bureau personnel] have whitewashed themselves for ten years...." These apparent conflicts of interest increased the public's suspicion that high level appointments within the Bureau were based on political party affiliation rather than on merit and qualification.[10]

No one was very surprised when the Sub-committee's Report proved to be remarkable for its disregard for factual accuracy, its misinterpretation of events, and its unrestrained bias:

Thus its reconstruction of the accident even concluded what purportedly was going through [Captain] Bolton's head as he guided his craft through the fog --all neatly arranged between quotation marks, just as if the committee, through some occult power, had communicated with the dead pilot.[11]

The sub-committee predictably absolved TWA of any responsibility in the accident.


President Roosevelt directed that the Secretary of Commerce "...call in and confer with a group of patriotic and widely known citizens who will cooperate with you in formulating plans for the control of this distressed situation...," which led to convening the National Accident Prevention Conference in December 1935. The Conference reported that funding cutbacks had led to the Bureau's being "...unable to keep pace with increased flying activities to such an extent that its regulation, construction, maintenance, and control of safety is scanty, and there now exists a state of critical inadequacy...." The air carriers identified deficiencies amounting to $31 million. The Bureau of Air Commerce reduced that estimate to $9 million, but the only possible source of funding could be the Public Works Administration, whose director pared the figure once again, to $5.12 million, the figure finally sent to the Bureau of the Budget. The Bureau of the Budget approved $500,000.[12]

In December, 1936, and January, 1937, five U.S. air carrier transports crashed. Four of the accidents resulted in fatalities. On February 28, 1937, Bureau of Air Commerce Director Vidal resigned. On March 1st, the Secretary of Commerce appointed Fred D. Fagg, Jr., formerly a professor at the Northwestern University Law School, as Vidal's successor. [In July, 1936, Fagg and Dean John H. Wigmore of Northwestern University Law School had begun codifying and standardizing issuance of the air commerce regulations.] Within the year he had reorganized the Bureau functionally and invited outside participants into the accident investigation process.[13]

More significant, Fagg was instrumental in developing and guiding the passage of the Civil Aeronautics Act of 1938, which established the Civil Aeronautics Authority as the single independent agency in charge of all federal government aviation activities. The Act also established an independent Air Safety Board within the Authority. The three-man Board was completely separated from the Authority's other functions, and was empowered to investigate accidents and recommend prevention measures.[14] The Civil Aeronautics Act established the legislative requirement for the Air Safety Board to determine Probable Cause of aviation accidents, but failed to clear up the original semantic misunderstanding which arose from the Air Commerce Act's requirement for determining "causes of accidents". According to Jerry Lederer, first Director of the CAA's Office of Aviation Safety, Fagg and Wigmore considered the principles of flight sufficiently mysterious that ascertaining the actual causes of accidents would be impossible. That skepticism led them to fabricate new applications for established legal terms of art. By neglecting to differentiate the new usages from the established lexicon, they introduced linguistic vagueness, the principal insurmountable flaw in the investigation process which survives to this day.[15]

Uncertainty, Ambiguity and Vagueness

Careful and correct use of language is a powerful aid to straight thinking, for putting into words precisely what we mean necessitates getting our own minds quite clear on what we mean. --William I. B. Beveridge

These three linguistic concepts are probably known abstractly to investigators, who may never have examined their application to investigations and their objectives:[16]

"Uncertainty" = "not definitely ascertainable or fixed, as in time of occurrence, number, dimensions, quality, or the like"; in scientific terms, the object cannot be "classified"

"Ambiguity" = "having several possible meanings or interpretations, equivocal"; in scientific terms, the object cannot be "discriminated"

"Vagueness" = "indefinite or indistinct in nature or character, as ideas, feelings, etc."; and thus incapable of being either classified or discriminated.

These concepts should be significant to investigators, who must eliminate as much uncertainty, ambiguity and vagueness as possible during the investigation process. Both uncertainty and ambiguity can eventually be resolved by sufficient data, information and facts. Vagueness, on the other hand, cannot. A vague term is itself inherently incapable of classification; its very definition lacks sufficient precision to enable either classification or discrimination. (McFr:93)

Attempts to transpose "cause" or "probable cause" from their legal definitions to scientific applications lead to increasing vagueness and indeterminacy. No amount of additional information or data can help us identify "cause" or "probable cause". The terms fail to define what it is we are looking for. As a result, we cannot determine what it is, where to find it, how to get to where it might be, how to recognize it if by chance we do, or how to guarantee its identity.
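The operational difference can be sketched in code. The example below is purely illustrative and not from the paper: "stall" has a decidable membership test, so an uncertain classification is resolved by gathering data, while "probable cause" supplies no test that any quantity of evidence could satisfy. The 60-knot threshold and all function names are invented for the illustration.

```python
# Illustrative sketch (not from the paper): uncertainty vs. vagueness.
# An uncertain classification has a decidable membership test and is
# resolved by more data; a vague term supplies no test at all, so no
# amount of additional evidence resolves it. Threshold is invented.

def classify_stall(airspeed_kt, stall_speed_kt=60.0):
    """Uncertain until data arrives; decidable once it does."""
    if airspeed_kt is None:
        return "uncertain"            # resolvable: go measure the airspeed
    return "stalled" if airspeed_kt < stall_speed_kt else "not stalled"

def classify_probable_cause(evidence):
    """Vague: there is no predicate to evaluate against the evidence."""
    return "undecidable"              # adding evidence changes nothing

print(classify_stall(None))           # prints "uncertain"
print(classify_stall(55.0))           # prints "stalled"
print(classify_probable_cause(["witnesses", "maintenance records"]))
```

More data moves `classify_stall` from "uncertain" to a definite class; no argument to `classify_probable_cause` can do the same, which is the sense in which vagueness is irresolvable.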

What are "Causes"?

If you would converse with me, first you must define your terms. - Voltaire

The concept of "Cause", as applied to the objectives of air safety investigations, evolved without definition along an inconsistent pathway:

"Cause" first appeared as a duty of the Department of Commerce in §172 of the Air Commerce Act of 1926:

[The Department] shall... (e) investigate, record and make public the causes of accidents....

"Probable Cause" materialized in the 1934 Amendment to the Air Commerce Act of 1926[17], which merged the formerly scattered aviation functions of the Department of Commerce's Aviation Branch into the Bureau of Air Commerce:

... the Secretary of Commerce shall, if he deems it in the public interest, make public a statement of the probable cause or causes of the accident....

The Civil Aeronautics Act of 1938[18] established a Safety Board within the Civil Aeronautics Authority, assigning to it in §702 duties to:

(2) investigate such accidents and report to the Authority the facts, conditions, and circumstances relating to each accident and the probable cause thereof; and

(3) make such recommendations to the Authority as, in its opinion, will tend to prevent similar accidents in the future; [19]

The Federal Aviation Act of 1958[20] established both the Federal Aviation Administration and the Civil Aeronautics Board, assigning to the CAB responsibility for "...the promotion of safety in air commerce."[21] A duty of the CAB specified in §701 was to:

(a) investigate such accidents and report the facts, conditions, and circumstances relating to each accident and the probable cause thereof;

The Department of Transportation Act of 1966[22] established the Department of Transportation, interposing it as a managerial layer superior to the FAA; and the National Transportation Safety Board. In §5 it established the duties of the NTSB, including:

(b)(1) determining the cause or probable cause of transportation accidents...;[23]

The Independent Safety Board Act of 1974[24] established the NTSB as an independent agency of the U.S. federal government, and established among its duties in §304:

(a)(1) investigate or cause to be investigated (in such detail as it shall prescribe), and determine the facts, conditions, and circumstances and the cause or probable cause or causes of accidents...[25]

As the duties of successive investigative agencies evolved from determining "cause" to "probable cause" to "cause or probable cause" to "cause or probable cause or causes", agency managers and investigators were denied enlightenment by the absence of any more comprehensive statutory definitions of "cause" or "probable cause". A 1942 description of the CAA Air Safety Board's interpretation of "probable cause" by Jerome F. "Jerry" Lederer, then its Director, has survived:

...we, therefore, endeavor to state how the accident happened and why. The "why" is our conclusion expressed in terms of probable cause and contributing factors.... It has been our endeavor to stick to a practical pattern which establishes the proximate cause as the probable cause and sets up the underlying or more remote causes as contributing factors. (Le:42)[26]

I don't find that much help. Both "proximate cause" and "probable cause" are legal terms of art; i.e., they possess specific meanings within the context in which they are applied. The two terms are defined as follows in their legal applications[27]:

"Proximate Cause":

That which, in a natural and continuous sequence, unbroken by any efficient intervening cause, produces injury, and without which the result would not have occurred.

That which is next in causation to the effect, not necessarily in time or space but in causal relation.

The proximate cause of an injury is the primary or moving cause, or that which, in a natural and continuous sequence, unbroken by any efficient intervening cause, produces the injury and without which the accident could not have happened, if the injury be one which might be reasonably anticipated or foreseen as a natural consequence of the wrongful act.

An injury or damage is proximately caused by an act or a failure to act, whenever it appears from the evidence in the case, that the act or omission played a substantial part in bringing about or actually causing injury or damage; and that the injury or damage was either a direct result or a reasonably probable consequence of the act or omission.

"Probable Cause":

Reasonable cause; having more evidence for than against.

A reasonable ground for belief in certain alleged facts.

A set of probabilities grounded in the factual and practical considerations which govern the decisions of reasonable and prudent persons and is more than mere suspicion but less than the quantum of evidence required for conviction.

* An apparent state of facts found to exist upon reasonable inquiry (that is, such inquiry as the given case renders convenient and proper), which would induce a reasonably intelligent and prudent man to believe, in a criminal case, that the accused had committed the crime charged, or, in a civil case, that a cause of action existed.

* The evidentiary criterion necessary to sustain an arrest or the issuance of an arrest or search warrant.

If we attempt to apply the terms "proximate cause" and "probable cause" in an investigation context using their legal definitions, Lederer's interpretation lacks coherence. Worse yet, these legal definitions add the following vague abstractions to the original confusion:

reasonable anticipation


natural consequence

substantial part

direct result

reasonably probable

reasonable & prudent

mere suspicion

apparent state of facts

reasonable inquiry

convenient and proper

less than the quantum of evidence needed to convict

as well as still more undefined concepts of cause:

intervening cause

primary cause

moving cause

reasonable cause.

Rules of law often retain some vagueness to permit flexibility in applying them to unforeseen circumstances. (CoCo:90) Equivocation has utility for an attorney attempting to prove or refute the elements of legal proof required to convince a jury of a client's guilt or innocence. However, that elasticity is not conducive to scientific hypothesis testing. By choosing determination of undefined "causes" or "probable causes" as their principal investigation objective, investigation authorities create a quandary for investigators who attempt to apply objective methodologies to the study and understanding of how accidents happen. In the absence of any more precise definitions against which to measure their conclusions, it is impossible for investigators to "determine" the "causes" or "probable causes" of mishaps. In fact, within the U.S. NTSB "causes" or "probable causes" are assigned administratively by visionaries far removed from the investigation process.[28]

Detours on the road to "probable cause"

1. Anyone can make a decision, given enough facts.

2. A good manager can make a decision without enough facts.

3. A perfect manager can operate in perfect ignorance.

--Spencer's Laws of Data

The laxity of logical rigor imposed on categorical assignments of "cause" and "probable cause" permits designations which defy rational credibility, e.g.:

The evidence showed that the airplane stalled (loss of flying speed) shortly after takeoff, settled into the trees and caught fire. The stall was caused by a loose screw in the horizon indicator which prevented the horizon bar from rising in the horizon indicator. (Le:63)

After witness interviews, reviews of maintenance documents, engine records, and other data, the Accident Investigating Officer determined that the engine failed at some point, the aircraft stalled and impacted the ground. This caused the accident. (USAF:96)

The obvious logical flaws in these examples are apparent. However, similarly flawed evaluations are not so obvious when readers do not analyze the detailed investigation process and the facts which it purports to reveal. Let us take as an example some early causal arguments which arose following the crash of Korean Air Flight 801 on approach to Agana, Guam, M.I. On the day of the crash (August 6, 1997), a KAL vice-president announced on network television (CNN) that weather and the lack of an operative glide slope on the instrument approach were the "causes" of the accident. In a report two days later, CNN reported that the airline "angrily denied" that pilot error was a factor. Ladkin comments on these statements after analyzing their logic considering other facts known at the time:

Let us suppose that the weather and the lack of GS were causes. The weather forecast, which for Pacific islands a few hours ahead is moderately accurate, and the lack of GS were known to the pilot before departure (he is required to be briefed on weather and NOTAMs for the route of flight, including the airport of arrival). Since pilot error was not a factor (this is our, rather than Korean Air's, supposition), both of the sufficiently determining causal factors were known to the pilot before takeoff. Proceeding with the flight at that point was thus dangerous. Since both pilot and airline management must accede to taking off for the flight to occur, we conclude that both acceded to departure despite the fact that the flight could probably not be safely completed. ... A pilot judgment to pursue a course of action which the aircraft cannot safely complete is a pilot error, as the term is normally used. Let us call the corresponding action of the part of Korean Air a management action (even though a decision may not explicitly have occurred, the choice is nevertheless made. An act of omission may still be regarded as an action when an action is required, which it is, because the airline is responsible for monitoring the pilot's training, behavior and judgement). Since pilot error was not a cause (we are supposing), it cannot have been an error on the part of the pilot, therefore the corresponding management action on the part of the airline occurred....Was this management action causal to the accident? Had this management action not occurred, the flight would not have proceeded. Had the flight not taken off, it would clearly not have crashed in Guam. Therefore this management action is a necessary causal factor. Thus, from the supposition that weather and lack of GS were sufficient causes, and that pilot error was not a cause, we are led to conclude that a management action on the part of Korean Air was also a necessary causal factor. 
It follows from Korean Air's statement, then, that they themselves are partly responsible for the accident, since this management action lies within their power to make or not make. (Ld:97)[29]
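Ladkin's argument turns on a counterfactual test: a factor is a necessary causal factor if, had it not occurred, the accident would not have happened. The toy model below is our sketch of that test, not Ladkin's formal method; the factor names and the "world model" are invented, with bad weather and the inoperative glide slope taken as jointly sufficient per the supposition in the quoted passage.

```python
# Illustrative sketch of the counterfactual test in Ladkin's argument.
# World model and factor names are invented; per the supposition, bad
# weather plus the inoperative glide slope suffice to cause the crash
# once the flight has been allowed to depart.

def flight_crashes(factors):
    """Toy world model: no departure, no crash; given departure, the
    two supposed causes are jointly sufficient."""
    departed = "management_approves_departure" in factors
    sufficient_conditions = {"bad_weather", "no_glide_slope"}
    return departed and sufficient_conditions <= factors

def is_necessary_causal_factor(factor, factors):
    """Had `factor` not occurred, would the accident still have happened?"""
    return flight_crashes(factors) and not flight_crashes(factors - {factor})

scenario = {"bad_weather", "no_glide_slope", "management_approves_departure"}

# The management action passes the test: remove it and the flight never
# departs, so the crash cannot occur -- Ladkin's conclusion.
print(is_necessary_causal_factor("management_approves_departure", scenario))  # prints True
```

The same test returns True for the weather and glide-slope factors, which is why conceding the first two suppositions forces Korean Air to concede the third.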

Defining the Problem[30]

It is more important to know where you are going than to get there quickly. Do not mistake activity for achievement. --Mabel Newcomer[31]

For the past sixty years the U.S. government's aviation accident investigative authorities have been charged with three specific objectives:

  1. Determining the facts, conditions and circumstances of accidents;

  2. Determining cause(s) or probable cause(s) of accidents; and

  3. Recommending action to mitigate or preclude future recurrence.

From its outset the CAA's Air Safety Board chose to elevate the task of determining cause(s) or probable cause(s) to preeminence. That choice apparently has never been seriously questioned. Succeeding organizations have maintained the priority of determining causation even as it has become obvious that many assigned "cause(s)" cannot be substantiated by the facts, conditions and circumstances of the individual case. "Cause" is, by definition, vague; "probable cause" is even more vague. Accidents are the concluding events of processes which are frequently complex, having developed over an extended period. Prerequisite conditions may have been established years or decades before.

One, or several, "probable cause(s)" cannot account for accidents' complex evolutionary process. The NTSB developed a predetermined menu of causes in a futile attempt to generate standardized categories of causation. Analysts who assign causation can pick "one from column A" or "two from column B", confident in the conventional wisdom that one size of cause fits all facts. The resulting data assures us that the principal "cause" of general aviation accidents is "Pilot failed to obtain or maintain flying speed". The NTSB has yet to discover that its acclaimed principal cause factor is not a "cause" at all; it is a description of what happened.

A corollary to vague specifications of causation is ineffectual preventive action. Recommendations for mitigation or prevention have had little effect in forestalling the seeming inevitability of prevalent accident mechanisms. Preventing recurrence requires specific identification of what went wrong, and precise remedies to change behaviors which enabled the progress of the accident process, or failed to recognize and arrest it.

Assigning causes is passive. It is little more than assigning responsibility and accountability, or "blame". Preventing recurrence requires action. It necessitates identifying specific dysfunctionalities, tracing their origins and changing the behavior which led both to and from them.

(If you do not believe that investigators assign blame, I invite your reference to Gerard Bruggink's paper "To Kill a Myth" presented at the 1987 ISASI Seminar. (Br:87) He argues that "By emphasizing `Who Caused the Accident?' rather than `What Might Have Prevented It?', investigation authorities engage in weighing causes and, therefore, weighing blame. Causal summaries identify the individuals and organizations that seem to be most at fault, balancing between probable cause and contributing factors.")

The two objectives which the NTSB attempts to achieve -- determining cause(s) and preventing recurrence -- are countervalent; that is, they are so fundamentally inconsistent that increasing concentration on one diminishes the worth of the other. So long as we cannot even define what "causes" are, efforts expended in their quest are squandered. Worse yet, the more vague the subsumed causal elements, the more efforts must be devoted to searching for things which have not, and cannot, be defined.


Organizational inefficiencies arise from constraints. In his Theory of Constraints (Go:90), Goldratt defines the nature of a "constraint":

A system's constraint is nothing more than what we all feel to be expressed by these words: anything that limits a system from achieving higher performance versus its goal. To turn this into a workable procedure, we just have to come to terms with the way in which our reality is constructed.[32]

Goldratt's Step 1: Identify the Constraint

The first step is to recognize that every system was built for a purpose, we didn't create our organizations just for the sake of their existence. Thus, every action taken by any organ --any part of the organization --should be judged by its impact on the overall purpose. This immediately implies that, before we can deal with the improvement of any section of a system, we must first define the system's global goal; and the measurements that will enable us to judge the impact of any subsystem and any local decision, on this global goal.[33]

The Air Commerce Act of 1926 and subsequent legislation formally mandated three objectives for U.S. government aviation mishap investigative authorities. It is apparent after 60 years that the agencies' concentration on the goal of determining causation has inhibited their ability to effect the others. They have continually succeeded in identifying something called "causes", yet they have failed either to identify accidents' facts, conditions and circumstances with rigor, or to produce recommendations which have prevented recurrence of similar accidents.

...our systems today are limited mainly by policy constraints. We very rarely find a company with a real market constraint, but rather, with devastating market policy constraints. We very rarely find a true bottleneck on the shop floor, we usually find production policy constraints. We almost never find a vendor constraint, but we do find purchasing policy constraints. And in all cases the policies were very logical at the time they were instituted. Their original reasons have since long gone, but the old policies remain with us.[34]

The NTSB's constraint is evident: it has chosen a policy which gives precedence to determining causes over preventing recurrence. Its investigators converge on the objective of determining the all-encompassing "Cause", and neglect legitimate cause-effect relationships which determine the progress of the accident processes. Thus they overlook potential early intervention points which possess realistic probabilities for effecting prevention.

Current investigation authority performance expectations cannot be elevated until the conflict over the priority of objectives is resolved:

Which purpose is more important --
   determining "cause(s)" or preventing recurrence?

The decision must be based on objective evaluation to determine which alternative has the greater opportunity for benefiting the aviation industry and its customers, not on convenience for investigators or the traditional practices of investigation authorities.

Goldratt's Step 2: Decide How to Exploit the Constraint

By `exploit,' Goldratt means we should wring every bit of capability out of the constraining component as it currently exists. In other words, "What can we do to get the most out of this constraint without committing to potentially expensive changes or upgrades?"[35]

The policy constraints which have arisen from the NTSB's conflicting objectives can be obviated by changing priorities. Were the first priority of the investigative authority to identify the "facts, conditions and circumstances" relating to the accidents which it investigates, the issue of causal vagueness would be eliminated. Facts, conditions and circumstances may be uncertain or ambiguous, but those problems can be overcome by additional facts, data and information.[36] Facts, conditions and circumstances are never vague. They are tangible, measurable descriptions of what happened. Once we know what happened, we can break the scenario down into specific events and conditions which encourage applying the tests and proofs of formal logic.

The Logic of Cause=>Effect

The machine does not isolate man from the great problems of nature but plunges him more deeply into them. -- Antoine de Saint-Exupéry

Four hundred years ago Bacon recognized that simple enumeration of events was inadequate methodology with which to conduct inductive logical analyses. John Stuart Mill (1806-1873) developed his classical canons of inductive inference, which encompass concepts of causation and effect. (CoCo:90) These methodologies are essential tools which enable investigators to develop a Calculus of Causation -- establishing with rigor the structure of causation which was precursor to the effect:

The observation that in causal explanation not just one "probable cause", but normally many causal factors explain the occurrence of an event, and that one cannot distinguish between "more necessary" and "less necessary" factors, is often attributed to John Stuart Mill; for example,... (GeHo:97)

It is usually between a consequent and the sum of several antecedents; the concurrence of them all being requisite to produce, that is, to be certain of being followed by the consequent. In such cases it is very common to single out only one of the antecedents under the denomination of Cause, calling the others merely conditions.... The real Cause is the whole of these antecedents; and we have, philosophically speaking, no right to give the name of causes to one of them exclusively of the others. [(Mi:43) cited in (GeHo:97)]

Mill's enduring principles have been adapted to the task of accident investigation and analysis in at least two specific applications. [(Be:97) and (GeLa:97a)] Benner calls his methodology "Multilinear Events Sequencing" ("MES"). It enables the investigator to apply sequential, cause-effect, necessity and sufficiency logic tests to develop a matrix of events leading to the undesired outcome.[37] MES can generate patterns of "event-pairs" which enable the investigator or analyst to compare discrete patterns among several accident processes. (Be:97)

Ladkin et al call theirs "WB-Graph" (from "Why-Because"). It formalizes a methodology which enables the investigator or analyst to identify significant system states and events, express them as propositional variables, and apply counterfactual testing to build a quasi-chronological graph of causal-factor relationships among sequential sets of variables. Although the methodologies differ in their semantics and logic testing, their outcomes and their utility to investigators are substantially similar. (GeLa:97a)
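The core mechanical step shared by these methods -- recording which events and states caused which others, then isolating the uncaused antecedents -- can be sketched in a few lines. The following is a minimal illustration, not Ladkin's formal WB-Graph method; the node names and the dict-of-lists representation are hypothetical choices for the sketch.

```python
# Minimal sketch of a Why-Because style causal graph. `causes` maps each
# node to the list of nodes recorded as causing it. A node that appears
# as a cause but has no recorded causes of its own is an "original cause"
# in the sense used by the WB-Graph method.

def original_causes(causes):
    """Return, sorted, the nodes that cause something but are uncaused."""
    mentioned = {c for cs in causes.values() for c in cs}
    return sorted(n for n in mentioned if not causes.get(n))

# Hypothetical accident fragment: the overrun and the earth bank jointly
# cause the collision; the collision causes the fire.
graph = {
    "collision": ["runway overrun", "earth bank in overrun path"],
    "fire": ["collision"],
    "runway overrun": ["delayed braking"],
}

print(original_causes(graph))
# Both uncaused antecedents surface, including the easily overlooked bank.
```

Even this toy version exhibits the property exploited in the analyses quoted below: an uncaused node such as the earth bank is flagged mechanically, rather than left to an investigator's judgment about which antecedent "deserves" to be called the cause.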

Goldratt's Theory of Constraints, "[w]hen effectively applied,...empower[s] the user to identify precisely and to execute the one or two focused changes that will produce the maximum system improvement with the minimum investment of time, energy, and resources --and do it right the first time, without costly trial and error."[38] He proposes that each cause normally has more than one effect. To test the validity of the assumed Cause=>Effect logic the analyst must find evidence of another expected coincident effect. [(Go:90) & (De:97)] Goldratt calls this phenomenon the "Effect-Cause-Effect" test. Dettmer restates it as follows:

If we accept that [CAUSE] is the reason for [ORIGINAL EFFECT], then it must also lead to [PREDICTED EFFECT(S)], which [do/do not] exist.[39]
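Dettmer's restatement is directly checkable: a hypothesized cause is retained only if every additional effect it predicts is actually observed. The sketch below is one possible rendering, assuming predicted effects are simply listed as strings; the medical example follows Dettmer's own appendicitis illustration (see note 39).

```python
# Sketch of the "Effect-Cause-Effect" test as restated by Dettmer: if
# [CAUSE] explains [ORIGINAL EFFECT], it must also lead to its other
# [PREDICTED EFFECT(S)], which either do or do not exist.

def effect_cause_effect(predicted_effects, observed_effects):
    """Return (accepted, missing): the hypothesis is accepted only if
    every predicted coincident effect was actually observed."""
    missing = [e for e in predicted_effects if e not in observed_effects]
    return (len(missing) == 0, missing)

# Dettmer's example: "appendicitis" is offered as the cause of
# "abdominal pain"; if valid, we should also see these effects.
predicted = ["fever", "elevated white cell count"]
observed = {"abdominal pain", "fever"}

ok, missing = effect_cause_effect(predicted, observed)
print(ok, missing)  # the hypothesis fails: no elevated white cell count
```

The value of the test is that rejection is explicit: the analyst is handed the specific predicted effect whose absence undermines the hypothesized cause, rather than a bare verdict.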

The Need for Testing

The greatest tragedy of science is that you often slay a beautiful hypothesis with an ugly fact. --Thomas Huxley

Investigators can reduce their vulnerability to criticism from special-interests by aggressively applying Cause=>Effect logic, testing their hypotheses and demonstrating replicability. Conclusions based on demonstrated logical reasoning are decidedly less assailable than those which call for blind acceptance of the investigating agency's experience and reputation.

Cause=>Effect logic enables self-testing by hypothesizing counterfactual arguments; i.e.,

Hypothesis: If Cause A results in Effect B, then
  Absence of Cause A will result in absence of Effect B.

Test: Remove Cause A.

If Effect B still happens, then Cause A cannot be a "cause" of Effect B.
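The three-step procedure above mechanizes directly. The sketch below assumes a toy "scenario model" -- a function from a set of candidate factors to the effects they produce -- which is an illustrative device, not any agency's methodology; the factor names anticipate the brake-failure case discussed next.

```python
# Minimal sketch of the counterfactual test stated above: a hypothesized
# cause survives only if removing it removes the effect.

def is_cause(scenario, factors, candidate, effect):
    """Counterfactual check. Precondition: with all factors present the
    effect occurs. Returns True if removing the candidate factor
    prevents the effect, i.e. the candidate passes the test."""
    assert effect in scenario(factors), "effect must occur in full scenario"
    remaining = [f for f in factors if f != candidate]
    return effect not in scenario(remaining)

# Hypothetical runway-overrun model: the overrun happens whenever the
# brakes fail, regardless of touchdown judgment.
def overrun_model(factors):
    effects = set()
    if "brake failure" in factors:
        effects.add("overrun")
    return effects

factors = ["misjudged distance and speed", "brake failure"]
print(is_cause(overrun_model, factors, "misjudged distance and speed", "overrun"))  # False
print(is_cause(overrun_model, factors, "brake failure", "overrun"))                 # True
```

In this toy model the pilot's judgment fails the counterfactual test while the brake failure passes it, which is exactly the pattern the examples below exhibit in the real investigation record.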

Let's look at a few examples:

In my article titled "Are These the Same Accident?" (Ri:83) I presented the outcomes of a General Aviation accident investigation. The field investigation report includes the following statement by the investigating officer (in this case an FAA inspector):

[The pilot] stated that he touched down approximately 150 feet past the runway threshold numbers; and

The right main landing gear disk brake assembly had failed at the area where it is welded to the brake disk housing. This area was badly corroded to the point where the metal in this area is paper thin.[40]

Furthermore, from the PIC's statement as to causation:

Brake Failure (Brakes are newly installed, one landing since overhauled) [sic][41]

The "Probable Cause(s)" stated in the NTSB's Brief Format for the same report are:

Pilot-in-Command: Misjudged distance and speed
Pilot-in-Command: Failed to initiate go-around[42]

Application of a simple counterfactual logic test demonstrates immediately the fallacies of both "probable causes"; e.g.,

Would the PIC's "correct" judgment of distance and speed have had effects that would have prevented the accident?

Would his "correct" judgment of distance and speed have prevented the brake failure?

Would a go-around have accomplished either of the above preventions?

[Ludi Benner and I used this accident to demonstrate the Multilinear Events Sequencing methodology in Part 2 of (BeRi:91).]

More than Just General Aviation

The results of Ladkin's WB-Graph analysis of the Lufthansa A320 accident at Warsaw in September 1993 will reassure major accident investigators that they are capable of committing similar logical errors.

Upon landing, none of the braking systems (air brakes, thrust reverse, wheel brakes) functioned for about nine seconds: the wheel brakes only started to function after about thirteen seconds. The aircraft ran off the end of the runway, collided with an earth bank and started to burn. ... It became clear that the logic of the braking systems was indeed a reason why the braking systems hadn't functioned as expected. However, many commentators focused upon this factor as the main cause of the accident, which is probably incorrect. There were many other necessary causal factors. The final report itself ascribed pilot decisions and behavior as `probable cause'. But what criteria are being used to determine that?

Further into the logic testing process itself:

One can immediately observe...that node 3.1.2: earth bank in overrun path is a causally-necessary node: hitting the bank was a cause of the damage and fire; the hit directly killed one person and rendered the other unconscious and therefore unable to participate in the evacuation. Furthermore, this node itself is not caused by any other event or state in the sequence. It is therefore to be counted amongst the `original causes' of the accident, according to the WB-graph method. However, it does not appear amongst the `probably cause' [sic] or `contributing factors' of the final report. We have therefore found a reasoning mistake in the report. It is not the only such node of which this is true.

And finally:

What is the consequence of this rigorous reasoning? Once we have identified the position of the earth bank as an original causal factor, we know that had the bank not been where it is, the accident that happened would not have happened. (It is, of course, possible that the aircraft could have broken up and burned for some other reason...but it's certainly not as likely as in the case where there's something there to hit!) Therefore, one could consider repositioning the bank in order to avoid a repeat. However, this was not considered or recommended in the report, we suppose because the position of the bank was not considered to be a causally-essential feature in the report. (GeHo:97) (All emphases in the original)

A similar WB-graph analysis of the accident involving American Airlines Flight 965 enroute to Cali, Colombia, on 20 December 1995, has been performed and is available. (GeLa:97b) One conclusion in particular should concern investigators and those employing investigation reports for the purpose of prevention:

...the graph represents 59 states and events noted by the Cali accident investigation commission and the NTSB as being causally-relevant. In contrast, the report's[43] Findings section lists only 16 of these (roughly a quarter), corresponding to 10 explicit findings.[44]

Dettmer cites eight Categories of Legitimate Reservation which should be tested to assure that analytical logic has been verified[45]:

1. Clarity

2. Entity Existence

3. Causality Existence

4. Cause Insufficiency

5. Additional Cause

6. Cause-Effect Reversal

7. Predicted-Effect Existence, and

8. Tautology.

Whatever hypothesis testing methodology investigating authorities choose to employ, its objectivity discourages the kind of special interest controversies that arose during the Cutting accident, and survive to do mischief to this day. By removing the "judgment calls" from the investigation and analysis process, robust logical assessment forces critics to demonstrate that their theories conform more plausibly with the factually-derived chronology of what happened. [46]

Where do we go from here?

He that will not apply new remedies must expect new evils, for
time is the greatest innovator. --Sir Francis Bacon

Before we decide where we want to go, we must determine accurately where we are in our progress toward fulfilling the objectives of investigations, both within government and outside it.

"Customers" and Accountability

Several years ago I asked a senior NTSB investigation manager to identify the agency's customers. He asserted that the NTSB had no `customers'.

In that case, I continued, how are quality standards for NTSB investigations established? By the other investigators, he declared. They were the only ones competent to evaluate the work of their peers.

I tried again: What about congressional oversight? Congress was necessary to appropriate funds, he said, but legislators and their staffs were generally too ignorant to be able to criticize the NTSB's work, although they managed to interfere continually.

Times have changed. The Congress has established a new purpose for the NTSB: providing assistance to families of passengers involved in aircraft accidents -- "the families of passengers involved in aircraft accidents within the United States involving an air carrier or foreign air carrier and resulting in a major loss of life."[47]

In my final editorial as editor of ISASI's forum I proposed, as a professional position for ISASI to take to help effect credible prevention strategies:

Oppose politically-inspired dissipation of investigative assets into areas beyond the missions and capabilities of investigation professionals. Investigators' responsibilities are hard enough to fulfill objectively and dispassionately without forced emotional confrontation interjecting subjectivity and potential bias. (Ri:97b)

I was wrong. The Aviation Disaster and Family Assistance Act for the first time designated real people to whom investigation authorities were accountable, real people who:

(1)...are [to be] briefed, prior to any public briefing, about the accident, its causes, and any other findings from the investigation; and
(2) are individually informed of and allowed to attend any public hearings and meetings of the Board about the accident.[48]

Although currently only applicable to the NTSB, it is likely that similar initiatives will spread worldwide. If so, investigation authorities will no longer be able to ignore accountability to their customers.

Applying Lessons Learned

In the aftermath of the ValuJet crash in Florida in 1996, William Langewiesche compiled a roster of lessons which all participants in the aviation transportation system need to learn, not least those responsible for investigating accidents and attempting to prevent their recurrence (La:98):

We can find fault among those directly involved --and we probably need to. But if our purpose is to attack the roots of such an accident, we may find them so entwined with the system that they are impossible to extract without toppling the whole structure....Beyond the question of blame, it requires us to consider that our solutions, by adding to the complexity and obscurity of the airline business, may actually increase the risks of accidents. ...
The ValuJet case...fits the most basic definitions of an accident caused by the very functioning of the system or industry within which it occurred. Flight 592 burned because of its cargo of oxygen generators, yes, but more fundamentally because of a tangle of confusions that will take some entirely different form next time. ...

Paperwork is a necessary and inevitable part of the system, but it, too, introduces dangers. The problem is not just the burden that it places on practical operations but also the deception that it breeds. The two unfortunate mechanics who signed off on the nonexistent safety caps just happened to be the slowest to slip away when the supervisors needed signatures. Other mechanics almost certainly would have signed too, as did the inspectors. Good old-fashioned pencil-whipping is perhaps the most widespread form of Vaughan's "normalization of deviance." (see Va:96) The falsification they committed was part of a larger deception --the creation of an entire pretend reality that includes unworkable chains of command, unlearnable training programs, unreadable manuals, and the fiction of regulations, checks and controls. Such pretend realities extend even into the most self-consciously progressive large organizations, with their attempts to formalize informality, to deregulate the workplace, to share profits and responsibilities, to respect the integrity and initiative of the individual. The systems work in principle, and usually in practice as well, but the two may have little to do with each other. Paperwork floats free of the ground and obscures the murky workplaces where, in the confusion of real life, system accidents are born.

Investigation authorities and their investigators must be educated and trained to recognize and expose systemic unsuitabilities, unfitness and irrelevance, and recommend changes even to those regulatory dogmas that have survived unquestioned since their origins. Gerard Bruggink once averred that a principal factor in accident causation is the "...uncritical acceptance of easily verifiable assumptions." More contributory factors might be discovered were we to replace "uncritical acceptance" with more stringent verification.

The Threat: Change or Status Quo?

It has recently become fashionable to invoke esoteric statistics to support conflicting and even contradictory predictions of future aircraft accident occurrences. Some forecasters reckon that a static accident rate will combine with increasing air carrier traffic volume to produce a major accident a week within ten years. (FAA:95a, FAA:95b, FSF:96) Another performed statistical jiggery-pokery to conclude that larger aircraft will actually decrease international air traffic density.[49] (Br:98) Nonetheless, the Flight Safety Foundation's analysis should be a challenge not only to the regulatory authority to which it is addressed, but to investigating authorities as well:

With marginal improvements in the accident rate becoming harder to achieve, the FAA and industry believe that they will need to rethink the fundamentals of their current operating and safety processes in order to improve safety performance. The goal of zero accidents and the challenges of the future environment simply make the status quo untenable....[50]

Current investigation products do not support these objectives. We have no objective measure of whether recommended "fixes" actually work. It is possible --even probable --that a substantial proportion of current "safety" regulations, policies, and operational procedures have no effect on achieving prevention or, worse yet, are actually inimical to safety.

Investigation work products should be the principal sources for identifying the fundamental factors upon which to build aviation system safety improvements. The usefulness of those work products depends on investigations incorporating rigorous methodologies which establish, test and verify a Calculus of Causation.

It is not enough for investigating authorities merely to generate arbitrary recommendations for transmittal to their regulatory counterparts. Recommendations which arise from fallacious "causes" cannot contribute to prevention. Those derived from the findings of competent investigations must be tracked after implementation to verify their quantitative efficacy. Authoritative assignment of bureaucratic responsibilities can bring accountability to the process of accident prevention.

It's time we got moving. We have already fallen far behind time's innovation curve.


(ACRC:95) Aeronautica Civil of The Republic of Colombia, Aircraft Accident Report: Controlled Flight Into Terrain, American Airlines Flight 965, Boeing 757-223, N651AA, Near Cali, Colombia, December 20, 1995. Santafe de Bogota, D.C.-Colombia.

(Be:97) Ludwig Benner, Jr., Introduction to Investigation. (Stillwater, Oklahoma State University Fire Protection Publications, 1997).

(BeRi:91) Ludwig Benner, Jr., & Ira J. Rimson, "Quality Management for Accident Investigations." ISASI forum, Part 1: V. 24, #1 (October 1991); Part 2: V. 25, #1 (March 1992). Sterling, VA., International Society of Air Safety Investigators.

(Br:87) Gerard M. Bruggink, "To Kill a Myth." Proceedings of the Eighteenth International Seminar of the International Society of Air Safety Investigators, Atlanta, Georgia, October 6-9, 1987. ISASI forum, V. 20, #4, February 1988, pp. 4-9.

(Br:98) Anthony Broderick, "Cry Wolf." Aviation Daily, March 6, 1998.

(CoCo:90) Irving M. Copi & Carl Cohen, Introduction to Logic, 8th ed. (New York, Macmillan, 1990) [ISBN 0-02-325035-6]

(De:97) H. William Dettmer, Goldratt's Theory of Constraints. (Milwaukee, American Society for Quality Press, 1997) [ISBN 0-87389-370-0]

(FAA:95a) Aviation Safety Action Plan, Zero Accidents...a Shared Responsibility. U.S. Department of Transportation, Federal Aviation Administration, February 1995.

(FAA:95b) FAA Aviation Forecasts, Fiscal Years 1995-2006. U.S. Department of Transportation, Federal Aviation Administration, March 1995.

(FSF:96) "Challenge 2000: Recommendations for Future Aviation Safety Regulation." Flight Safety Digest, V. 15, #6, June 1996. Arlington, VA, Flight Safety Foundation.

(GeHo:97) Thorsten Gerdsmeier, Michael Höhl, Peter Ladkin & Karsten Loer, "How Aircraft Crash," 11 June 1997. RVS Group, Technical Faculty, University of Bielefeld. at

(GeLa:97a) Thorsten Gerdsmeier, Peter Ladkin & Karsten Loer, "Formalising Failure Analysis." RVS Group, Technical Faculty, University of Bielefeld. at

(GeLa:97b) Thorsten Gerdsmeier, Peter Ladkin & Karsten Loer, "Analysing the Cali Accident With a W-B Graph." Presented at the Human Error and Systems Development Workshop, Glasgow, March 1997. (Second Version, March 1997). at

(Go:90) Eliyahu M. Goldratt, Theory of Constraints. (Great Barrington, North River Press, 1990) [ISBN 0-88427-085-8]

(Ko:84) Nick A. Komons, The Cutting Air Crash: A Case Study in Early Federal Aviation Policy, 2d ed. (Washington, U.S. Department of Transportation, 1984)

(La:98) William Langewiesche, "The Lessons of ValuJet 592." The Atlantic Monthly, March 1998, pp. 81-98.

(Ld:97) Peter Ladkin, "The Crash of Flight KE801, a Boeing 747-300, Guam, Wednesday 6 August, 1997: What We Know So Far." Article RVS-J-97-06, 16 September 1997. at, guam.html.

(Le:42) Jerome F. Lederer, (Director, Safety Bureau), Memorandum to the Civil Aeronautics Board dated June 12, 1942. Subj: "Basic system of analyzing aircraft accidents", pp. 2-3.

(Le:63) Jerome F. Lederer, "Methodology and Patterns of Research in Aircraft Accidents." Annals of the New York Academy of Sciences, V. 107, Article 2, pp. 670-685, May 22, 1963.

(Le:92) Jerome F. Lederer, "Is Probable Cause(s) Sacrosanct?." ISASI forum, V. 25, #1, March 1992, pp. 8-9.

(McFr:93) Daniel McNeill & Paul Freiberger, Fuzzy Logic. (New York, Simon & Schuster, 1993) [ISBN 0-671-73843-7]

(Mi:43) John Stuart Mill, A System of Logic, 8th ed. 1843. (London, Longmans, 1873) Quoted in (GeHo:97).

(Mi:91) C.O. Miller, "Down with Probable Cause." Proceedings of the Twenty-Second International Seminar of the International Society of Air Safety Investigators --Canberra, Australia, November 4-7, 1991. ISASI forum, V. 24, #4, January 1992, pp. 120-135.

(Ri:83) Ira J. Rimson, "Are These the Same Accident?". ISASI forum, V. 16, #3, 1983.

(Ri:97a) Ira J. Rimson, "Establishing Quality Controls for Forensic Expertise: Case Hardening by the Scientific Method," Forensic Accident Investigation: Motor Vehicles --2, ed. Thomas L. Bohan (Charlottesville, Lexis Law Publishing, 1997) [ISBN 1-55843-680-5]

(Ri:97b) Ira J. Rimson, "1997 --New Challenges." ISASI forum, V. 30, #1, January - March 1997.

(USAF:96) United States Air Force, Aircraft Investigation Report T-3A (SN 93-0584), Air Force Academy, Colorado, 30 September 1996. Air Force Education & Training Command, Randolph AFB, Texas, 1996.

(Va:96) Diane Vaughan, The Challenger Launch Decision. (University of Chicago Press, 1996) [ISBN 0-226-85175-3]


The inspiration for this paper arose from the frequently contentious (but always polite) discussions which originated from Ludi Benner's Investigation Process Research References Library [at] and subsequently expanded into a forum within which persons with widely disparate investigation expertise were encouraged to shed the constraints of tradition and indulge their imaginations in brainstorming new approaches to both investigation and prevention. I acknowledge the contributions of the Investigation Process Research References Library IRRegulars, from whom many of the ideas in the paper arose, and who motivated my search for some other tool than the ubiquitous hammer[51]:

  • Michael Allocco; FAA Office of System Safety

  • Ludwig Benner, Jr.; Investigation Researcher & Former Chief, NTSB Hazardous Materials Division (Retired)

  • Hughes Chicoine; Certified Fire & Explosion Investigator (including arson), Montreal, PQ, Canada

  • Steve Corrie, FAA Office of System Safety & Former NTSB Major Accident Investigator

  • Professor Peter B. Ladkin; University of Bielefeld, Germany

  • Professor Nancy Leveson; University of Washington

  • Dr. C. O. Miller; Attorney, Engineer and Former Director, NTSB Bureau of Aviation (Retired)

  • Jim Stewart; Director-General, System Safety, Transport Canada (Retired)

  • Frank Taylor; Director, Cranfield University Aviation Safety Centre, U.K.

  • Professor William Waldock; Embry-Riddle Aeronautical University - Prescott

  • Richard H. Wood; Former Director, U.S.C. Aviation Safety Programs & Colonel, U.S. Air Force (Retired)

  • Dmitri V. Zotov; Lecturer in Accident Investigation, Massey University, N.Z.


    [1]. Public Law 69-254, 44 Stat. 568
    [2]. Komons, Op. Cit. pp. 1-2.
    [3]. U.S. Senate Resolution 146, 74th Congress, 2d session (in Komons, Op. cit. p. 28)
    [4]. Bureau of Air Commerce "Statement of Probable Cause Concerning an Aircraft Accident which Occurred to Plane of Transcontinental and Western Air, Inc., on May 6, 1935, near Atlanta, Macon County, Mo.", in Air Commerce Bulletin, Vol. 7 (July 1935) (cited in Komons, Op. cit. p. 32.) Note that the "causes" objective has progressed from "...the probable cause or causes of accidents..." in the 1934 legislation (see p. 8 infra), to "...probable direct cause..." in the Board's principal conclusion, suggesting that semantic vagueness reigned even at this early stage of formal governmental investigation.
    [5]. Komons, Op cit., pp. 32-33.
    [6]. Id. pp. 32-33.
    [7]. Id. pp. 35-57. A principal TWA argument rebutting the charges of regulatory violation was that the regulations had been improperly issued, and therefore were not legally in force at the time of the accident.
    [8]. Id. pp. 58-60.
    [9]. Plus ça change, plus c'est la même chose. [The more things change, the more they stay the same.]
    [10]. Id. p. 67.
    [11]. Id. p. 76.
    [12]. Id. pp. 51, 66.
    [13]. Id. p. 82.
    [14]. Id. p. 84.
    [15]. Had the scientific skeptics who drafted the original Civil Aeronautics Act of 1938 been better acquainted with applying formal logical analyses to scientific observation they might have better defined the terms of the legislation. Legal and scientific mind-sets are diametrically opposite: laws are man-made constructs; they tend to be absolute and morality-based to fulfill their functions within the adversarial legal system. Science, on the other hand, is relative and fact-based, and functions within a collegial system. (Ri:97a)
    [16]. Definitions from Random House Dictionary of the English Language. New York, Random House, 1967.
    [17]. P.L. 73-418, 48 Stat. 1113-1114.
    [18]. P.L. 75-706, 52 Stat. 973 et seq.
    [19]. The tasks of reporting the facts, conditions and circumstances relating to accidents and recommending actions to prevent recurrence are included in all subsequent legislation, and not repeated herein.
    [20]. P.L. 85-726, 72 Stat. 731-811.
    [21]. Id. ¶102(e).
    [22]. P.L. 89-670, 80 Stat. 931-950.
    [23]. This was the first time that a requirement to determine "probable cause" was imposed on transportation modes other than aviation.
    [24]. Title III of P.L. 93-633, 88 Stat. 2166-2173.
    [25]. Emphasis added in all the preceding quotations.
    [26]. Fifty years later (Le:92) Lederer reflected on the origins of "Probable Cause" and suggested that much of the semantic controversy might be obviated by adopting "Findings", "Significant Factors" and/or "Recommendations" instead. He once again failed to address the definition issue, which would remain an enduring source of vagueness absent more precise specification.
    [27]. Black's Law Dictionary, 6th ed., St. Paul, West Publishing Co., 1990.
    [28]. In 1972, when he was Director of the NTSB's Bureau of Aviation Safety, C.O. Miller briefly instituted a definition of "probable cause" more suitable to the purpose of investigation: "...a description of the physical nature of the event producing injury or damage and amplified by those cause-effect relationships about which timely preventive action can be taken." In 1973 the Board's newly-appointed General Manager exerted his arriviste authority by superseding the definition unilaterally. (Mi:91)
    [29]. Investigating agency managers and public affairs staff might productively adopt Ladkin's analytic technique for use in responding to party and press allegations.
    [30]. The following discussion relates specifically to the United States and U.S. investigation authorities. It is equally applicable to the conduct of investigations by any entity, anywhere, which accepts the universality of acknowledged constructs of science and logic.
    [31]. Quoted in (De:97)
    [32]. Goldratt, Op.cit., p. 4.
    [33]. Id.
    [34]. Id., pp. 6-7.
    [35]. Dettmer, Op. cit. p. 14.
    [36]. The "gap-filling" process may include analyses of alternative probabilities in cases where facts sufficient to support an unequivocal logical conclusion cannot be ascertained. See, e.g., (BeRi:91) and (HeBe:86).
    [37]. Copi & Cohen (Op. cit. pp. 377 - 380) posit that "We can legitimately infer cause from effect only in the sense of necessary condition. And we can legitimately infer effect from cause only in the sense of sufficient condition. Where inferences are made both from cause to effect and from effect to cause, the term `cause' must be used in the sense of `necessary and sufficient condition.'"
    [38]. Dettmer, Op. Cit., p. xxii.
    [39]. Id. pp. 50-51. Example: "`I have appendicitis' might be offered as the cause of the effect `I have a pain in my abdomen.' But if the cause is really valid, we might also expect to see a couple of other effects: `I have a fever' and `My white cell count is elevated.'"
    [40]. NTSB Accident No. NYC-80-F-HJ03, Form 6120.4, Part U: Narrative Statement. [in (Ri:83), p. 12]
    [41]. Id. , NTSB Form 6120.1: Pilot/Operator Aircraft Accident Report
    [42]. In (Ri:83), p. 13.
    [43]. (ACRC:95)
    [44]. (GeLa:97b), p. 17 of 22.
    [45]. Dettmer, Op. cit., Chapter 2.
    [46]. Investigators who are appropriately educated and trained can incorporate logic testing during the investigation process, where it can be much more efficiently employed to detect logic problems before the report is publicized and the adversarial process reacts.
    [47]. The Aviation Disaster Family Assistance Act of 1996: Title 49, U.S. Code, Subtitle II, Chapter 11, Subchapter III, §1136. Actions have been initiated to expand the scope of the government's family assistance role to include surviving passengers, accidents involving U.S. flag carriers outside the United States, all commercial (for hire) operations and flight crew members' families.
    [48]. Id. ¶1136.(e): Continuing Responsibilities of the Board
    [49]. As the larger airplanes will presumably accommodate more passengers, each crashed airplane should therefore result in more deaths/airplane, so that if the relative ratios are complementary, there will be the same number of crash victims for fewer airplanes lost. Mr. Broderick should recall that airplanes don't complain, sue, vote, or demand boffins' heads on platters.
    [50]. (FSF:96), p. 13.
    [51]. When the only tool you own is a hammer, every problem begins to resemble a nail. -- Abraham Maslow
