
Tuesday, August 09, 2011

The Complexity Issue/Type III IV Errors

An idealist is one who, on noticing that a rose smells better than a cabbage, concludes that it will also make better soup.

H. L. Mencken

Thought experiment: Executive Bill believes that siRNA can be used to knock down TNF alpha and reduce swelling in the joints of rats, and that it will therefore do the same in human beings.

Think of Executive Bill as a Martian who has come to the conclusion that humans will prefer rose soup over cabbage soup. He sends his little Martians out to prove him right.

By setting up such a complex hypothesis, Bill has failed to acknowledge that the siRNA may not even be the proper molecule to mimic a naturally occurring small piece of RNA that reduces gene expression. Smell alone may not predict food preference. We fool ourselves all through the hypothesis testing and reach the end point (where Executive Bill demands a full report in his office) with the false assumption that a human being should see a reduction in joint swelling. Executive Bill wants to know one thing: did the human see a reduction in joint swelling? Did the humans prefer rose soup? If the human sees an improvement, Executive Bill is vindicated. If there is no improvement, Executive Bill sends the scientists back into the lab to tweak the system.

In colloquial usage a type I error might be called "believing a falsehood": the test rejects a true null hypothesis and reports an effect that is not there, a false positive. A type II error is "failing to believe the truth": the test fails to reject a false null hypothesis and misses a real effect, a false negative. In biotechnology the trouble starts when desired results do not match actual outcomes. Executive Bill will reject the truth (siRNA against TNF alpha has no effect on joint swelling) and send the scientists back to the lab until they produce the desired results. The Martian leader will send the little Martians back to Earth to get them to prefer Rose Soup.

The error Executive Bill would eagerly accept is the type I kind: data showing a reduction in joint swelling from the siRNA treatment when the siRNA had nothing to do with the reduction. On paper, a false positive is indistinguishable from vindication.

The Martian leader would eagerly accept any data that showed humans preferring Rose Soup. Let's say the problem came from a mislabeling of the soups, or from fraud committed by an ambitious little Martian. The desired results were obtained. They were false. They were accepted. Executive Bill and the Martian leader accept what they are being told.

In these two examples, siRNA and Rose Soup, we (the Cargo Cult Scientist) are saying that the non-erroneous outcome would be that siRNA has no effect on TNF alpha or anything else in the human body, and that human beings will not eat Rose Soup. Our antagonists, Executive Bill and the Martian leader, have desired outcomes that should be proven false. No matter what the data show, they will accept only validation of their hypotheses. The easiest path to success, as defined by Bill and the Martian, is therefore a false positive. Honest negative results will be thrown out until a false positive can be arranged.
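The arithmetic behind "send them back to the lab" is worth making explicit. A minimal sketch, under my own simplifying assumption that each honest rerun of a no-effect experiment has an independent 5% (alpha = 0.05) chance of crossing the conventional significance threshold:

```python
# Sketch: probability that repeated testing of a treatment with NO real
# effect eventually yields a "significant" result anyway.
# Assumption (mine, not the post's): each rerun is independent and has a
# false-positive probability equal to the conventional cutoff alpha = 0.05.

ALPHA = 0.05  # conventional significance cutoff

def chance_of_false_positive(retries: int, alpha: float = ALPHA) -> float:
    """Probability that at least one of `retries` independent experiments
    on an ineffective treatment comes out 'significant'."""
    return 1.0 - (1.0 - alpha) ** retries

for k in (1, 5, 14, 30):
    print(f"{k:2d} tries -> {chance_of_false_positive(k):.0%} chance of 'success'")
```

Under this assumption, fourteen honest retries already give better-than-even odds of a spurious "success"; the desired result can always be arranged if the budget holds out.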

Now let's bring in a superior being I will refer to as God. It is the Cargo Cult God, and he needs to keep his subjects ignorant for his own amusement. He delights in the folly of minds that keep themselves fooled at all times. The Cargo Cult God wants to explain how Executive Bill and the Martian leader reached their status in their respective groups and how they maintain their positions. First, they are bullshitters, and thus would use the truth if they could get to it. But getting to the truth would be a random act, since they do not know how to conduct research. They begin from ignorance: either a type I or type II error, or a randomly yet unknowingly correct assumption. They then select a desired outcome and draw a line from A) the error or random correct assumption to B) the desired outcome. The desired outcome, however, cannot be attributed to the path of reasoning depicted by the line from A to B.

In 1974, Ian Mitroff and Tom Featheringham argued that "one of the most important determinants of a problem's solution is how that problem has been represented or formulated in the first place".

They defined type III errors as either the error of having solved the wrong problem when one should have solved the right problem or the error of choosing the wrong problem representation when one should have chosen the right problem representation.

In other words, Executive Bill set out to solve the problem of how to make siRNA into an anti-RA drug when he should have tried to solve the problem of whether siRNA can be used as a drug at all. Maybe siRNA cannot even survive in the human body.

Likewise, the Martian leader wanted to prove that Rose Soup would be preferable to Cabbage Soup. He should have conducted research on the relationship between the human senses of smell and taste.

In 2009, Ian I. Mitroff and Abraham Silvers published 'Dirty Rotten Strategies', a book on type III and type IV errors that provides many examples of both developing good answers to the wrong questions (type III) and deliberately selecting the wrong questions for intensive and skilled investigation (type IV). Most of the examples have nothing to do with statistics; many are problems of public policy or business decisions.

The human body is complex. Biotechnology has yet to accept this fact. The study of type III and IV errors has been handed a gift in the form of medical research. How did we get to the point where we are today? To claims of 90% inaccuracy in our scientific journals? To publication retractions rising for unknown reasons?

The desired (i.e., non-erroneous) outcomes of a test are called a true positive, meaning "rejecting the null hypothesis when it is false", and a true negative, meaning "not rejecting the null hypothesis when it is true". But what happens when the desired outcome is not the non-erroneous one?
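Following those standard definitions, and taking the null hypothesis to be "no effect" as is conventional, the four textbook outcomes can be laid out mechanically. A small sketch; the `classify` helper is my own illustration, not a standard library function:

```python
def classify(null_is_true: bool, null_rejected: bool) -> str:
    """Label a test outcome, where the null hypothesis states 'no effect'."""
    if null_rejected:
        return "type I error (false positive)" if null_is_true else "true positive"
    return "true negative" if null_is_true else "type II error (false negative)"

# Executive Bill's ideal world: a real effect, correctly detected.
print(classify(null_is_true=False, null_rejected=True))   # true positive
# What his retries actually manufacture: no effect, "detected" anyway.
print(classify(null_is_true=True, null_rejected=True))    # type I error (false positive)
```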
