Unverifiable "Knowledge" is Demonstrably Trivial.

On April 14, 1994, two American Air Force pilots flying F-15 fighter planes misidentified two American Army helicopters operating in Iraq and shot them down, killing all 26 people aboard. Because the lead pilot did not evaluate his personal belief against external evidence, 26 people are dead.

In an interview after the incident, the lead pilot, the pilot who fired on the helicopters, reported that he had no doubt they were Iraqi helicopters when he shot them down. The wingman, who was supposed to confirm the identification, took the lead pilot's word for it, and the AWACS officers in charge of command and control believed the lead pilot even though they had information that the helicopters were scheduled to be there. Had they taken the time to look at all the evidence, it would have been obvious that these were not Iraqi helicopters. His personal belief needed external verification.

After the fact, the Secretary of Defense identified four causes of the incident:
- The pilots misidentified the Black Hawks.
- The AWACS crew failed to intervene.
- The helicopters were not well integrated into the task force.
- The "Identification Friend or Foe" (IFF) system failed. (The system itself worked properly; it was simply not configured correctly.)
Retired Lt. Col. Scott Snook wrote about his investigation of the incident in a book called "Friendly Fire: The Accidental Shootdown of U.S. Black Hawks over Northern Iraq".
Wikipedia has an entry on it as well: 1994 Black Hawk Shootdown Incident.
I have included Dr. Snook's elaboration on the causes at the end of this article.

Though this case is usually studied as an example of how decision making in an organization fails, I want to focus on an aspect captured in the following statement, since the helicopters were shot down by the man who held the belief.
"AWACS crew members added in their testimonies that once Wickson (the lead pilot) and May (the wingman) visually identified the helicopters as hostile, all responsibility for the shootdown passed to the F-15 pilots."
— Peterson, "Court-Martial Begins in 'Friendly Fire' Deaths in Iraq"; Piper, Chain of Events, pp. 214–215.

Though the AWACS officers had prior information about the helicopters, they took the word of the lead pilot when he said he saw Iraqi helicopters. In fact, the AWACS officers were impressed by the lead pilot's ability not only to identify the helicopters as enemy but to identify what kind they were. The AWACS officers placed a higher value on the lead pilot's belief than on the data they had on file. Additionally, the wingman in the other fighter jet did not confirm that the helicopters were enemy; he only confirmed that there were two helicopters. The wingman believed that if the lead pilot believed they were enemy helicopters, then they must be.

The pilots expected that any helicopters they saw during their initial patrol could only be Iraqi helicopters. When they saw the helicopters, they perceived and inferred what they expected. They believed what they thought they saw. They were certain that they had knowledge of two Iraqi helicopters and that they were justified in shooting them down.

We are fundamentally bounded in our rationality. We are bounded by the physical architecture of our brains, our experiences, what we already know and believe, our feelings, and our self-interest. We don't examine every possible option, or every scrap of data, before we make a decision. We adopt heuristics, mental shortcuts. Usually, when the stakes are low and mistakes happen, whatever harm is done is tolerable. When the stakes are high, however, procedures need to be in place to correct for cognitive bias and human error.

Some of the biases that were obviously involved in this incident are:
- Overconfidence bias: human beings are systematically overconfident in their judgments.
- Confirmation bias: human beings tend to gather and rely on information that confirms their existing views, and tend to avoid or downplay information that disconfirms what they think is the case.
- Accepting someone's word based on acquaintance: the wingman worked closely with the lead pilot and had a lot of respect for his skills.
- Deference to authority: the wingman and the AWACS officers did not question the misidentification by the lead pilot, even though the AWACS officers had information about the identity of the helicopters prior to the shootdown.
Cognitive bias skews our thinking and makes it hard to come to correct conclusions, make good decisions, and formulate "justified beliefs". That is why it is important to use methods to counteract cognitive bias. One of the first to be formulated was the scientific method. If the scientific method is used as intended, it counteracts many of the effects of cognitive bias. If a person's belief system makes it difficult to trust the scientific method, then it should at least be agreed that things in general need some definitions and boundaries, and that those definitions and boundaries should be kept in mind when deliberating. Human error should always be considered likely in anything a human does.

List of Cognitive Biases from Wikipedia.
[Wikipedia should not be considered authoritative, but it is a good place to start.]

In an interview after the fact, the fighter pilot reported that he had no doubt they were Iraqi helicopters when he shot them down. The Black Hawks did not even cross his mind when he made the decision. The lead pilot "knew" that they were Iraqi helicopters.

What is Knowledge?
Knowledge can be knowledge of how to do something, of a person or a place, or of propositions. This discussion will be limited to "someone knows that a proposition is true or is a fact". Briefly stated: "S knows that P", or "The lead pilot knew that they were Iraqi helicopters", or "I know that God exists because of the inner witness of the spirit".

Epistemologists have wrestled with the idea of "rightly justified belief" as a definition of knowledge, but they always arrive at the same point of disagreement: what makes some knowledge or belief preferred over another? Can a consensus be reached on a standard for determining what makes some knowledge or belief more preferred, or "better", than another? I think an external standard has already been found and has been put into practice for many years in fields such as public safety, public health, and civil engineering. I think the strongest work in epistemology is being done outside the domain of philosophy, and not by philosophers.

Justified Beliefs
The fighter pilot's belief about the Iraqi helicopters was not "rightly justified". It was a weakly justified belief, resting on little evidence, of a type that was likely to be in error. While it's true that he had to make a time-critical decision, and while a military hearing found him not culpable for anything other than making a mistake, some points in the timeline of the event were identified at which the shootdown could have been prevented had some action been taken to account for the likelihood of human error. The team could have considered the external data they had. Someone should have asked the lead pilot, "What makes you so sure those aren't the Army helicopters we expect to work with today?"

Introduction to some key concepts in Epistemology
Key terms in epistemology are belief, truth, justification, evidence, reliability, internalism, externalism, foundationalism, and coherentism, but unfortunately, some of the key terms in use are largely undefined. In reading through the epistemological literature, it is obvious that in some cases the terms and words are minced until they are no longer useful. The result is that some philosophers posit obviously improbable and unknowable "thought experiments" as analogies to use in deliberation, while presuming that the analogy "fits". The "brain in a vat" thought experiment is a famous one, and Berkeley's "we all exist in the mind of God" is another. In reality, a thought experiment that breaks down the boundaries so far as to permit fantasy is not very useful. We have to find a reliable way to exclude fantasy and, more importantly, human error.

In order to make progress in defining what is knowledge and what is not, some standards need to be agreed on. If language is insufficient to capture a definition of knowledge, yet everyone seems to "know" things and uses that knowledge to interact with the world, then "what knowledge is" is not as important as "what action are you taking on what you think you know?" More importantly: "Will it cause any harm? Does the potential benefit outweigh the risk of harm?" What justifies a person in taking some action based on what they think they know?

Having The Humility To Accept The Possibility Of Error In Perception
Let's look at the shootdown incident through an epistemological lens and try to articulate why one variety of knowledge is preferred over another.

1. The pilot thinks that if he sees helicopters where he doesn't expect them, they will be enemy.

2. He sees helicopters where he didn't expect them.

3. He is an expert; he gives his opinion to his team.

4. His team defers to his expertise rather than checking the data.

5. He shoots down the helicopters and kills twenty-six people.

6. Had he verified his belief against the external data held by the AWACS plane, he would not have shot down the helicopters.


But we all already knew that, didn't we? This principle is already presumed in society. It's just that some of us have to deny it to make a system of beliefs work.

Therefore, generally, unverifiable internal knowledge is trivial compared to externally verifiable knowledge. This principle is accepted as sound and is expected to be used in making judgments. Failing to use it can be considered negligent.

Using an external standard of minimizing harm, I have shown that the relative value of internal knowledge is less than the value of external knowledge.

Equivocation Of The Word "Justification"
Paul's use of "justification by faith" means "justified to join the Christian community of believers" by faith in Christ, not by being Jewish, joining the Jewish community, or following Jewish law. It's not a knowledge claim at all; it's membership criteria. So "justification" is membership criteria for whatever is being assessed for inclusion in a category.

That would fit the task of assessing whether a belief should be considered knowledge. Justification for inclusion in the Jewish community is quite a different thing from justification, by faith, of knowledge of God. When someone says they are justified in a belief in God by faith, they are making a circular statement: faith is belief in God, and belief in God is faith. Or do I have a misconception?

Belief does not seem to be the preferred way to acquire knowledge, because it does not counteract the likelihood of human error.


Here is a brief summary of the elaboration on the Secretary of Defense's findings from Snook's book, with my notes in brackets and curly braces.

- The helicopters were Army; the fighter planes were Air Force. They did not effectively share information with each other.

- The fighters' pre-flight papers did not indicate the helicopters were going to be there.

- Before anyone else may enter the zone, fighter pilots fly around the zone to ensure there are no enemy forces and make the call that it is clear. The US helicopters were in the zone before the pilots had finished their initial flight. The pilots did not expect to see helicopters in the area until they had reported it clear of enemy forces.
- An American AWACS plane was in the area whose task was to monitor and control the area. Its crew knew that both the fighter planes and the helicopters were in close vicinity to one another.
[Part of the verification process]
- The helicopters could not respond to the IFF signal in the affirmative because they were not using the same code as the fighters.

- The US Helicopters were outfitted with extra fuel tanks that caused them to resemble Iraqi helicopters.

- The pilots were not familiar with the new equipment configuration.

[Human Error]
- The fighter pilot and wingman did not verify or confirm each other's conclusions when they conducted the visual assessment, though each INFERRED the other's confirmation due to ambiguous language.

[Cognitive Biases]
- Interview after the fact: the fighter pilot reported that he had no doubt they were Iraqi helicopters when he shot them down. The Black Hawks did not even cross his mind when he made the decision.
{He did not have access to information he needed that was stored in his own memory. For some reason, his cognitive processes did not retrieve what he had in memory and bring it to consciousness. This is a common cognitive limitation that happens to people daily, and will probably happen to the reader today or this week.}

- Interview after the fact: The wingman said that when the fighter pilot said he identified them as Iraqi helicopters, he believed him.
{In social psychology, research on persuasion has identified four major categories of persuasion variables. In this instance the wingman's decision making was biased by the variables in the "communicator" category. They all liked and trusted the pilot and viewed him as an authority. Research on persuasion has demonstrated that people are more easily influenced by people they like, trust, consider an authority, or find attractive.}
- Persuasion
- Weapons of Persuasion

- Interview after the fact: the AWACS officers trusted the fighter pilot's opinion over the data, so they did not challenge him. {Same as above}