Friday, May 05, 2006

Provisional answers, decision-making and over-analysis Part II

Part I

If you are looking to establish the once-and-for-all truth of a thing, you will be waiting around forever. If you are looking to make the best decision possible, you will take the best information available and make a decision based on that. But you will realize that that decision is provisional, that it is subject to change when better information comes along. The good thinker will be ready to make the change when that better information arrives.

The other problem is that there is no way to decide in the abstract that one solution is better than another. The trash bin of history is full of ideas that looked very good on paper, that were carefully brainstormed and had the support of smart people. But in the end, how do you know that such an idea is any better than any other that might have been thought up? On paper, the arguments can be made, and those arguments can be extremely rational. But the only way you can tell is if it actually works on the ground, when you "fire it up." That means that good critical thinking and good solutions are always made with an eye on the ground or, to put it another way, with an eye to context. If it doesn't work in reality, or in the reality you are working with (the market, the environment, etc.), it is not a good idea, no matter who was behind it at first, no matter the brain power that went into it at first, and no matter how much money it took to come up with it at first. If it does not work on the ground or in context, it is a bad idea. Period.

This may seem obvious, but again, history is full of ideas that looked good in the brainstorming sessions and worked out wonderfully on paper but crashed and burned when they were implemented.

Isn't that something we won't be able to figure out until we actually fire the thing up? We cannot tell completely, it is true. But there is a lot we can understand about it even in the developmental stages if we keep the context in mind from the start.

Provisional answers, decision-making and over-analysis Part I

When you push people to think critically, which I tend to do with both my students and those in business, certain questions tend to come up. By critical thinking, I mean that everything is subject to evaluation and reevaluation, a point made in a few posts already. There is no final answer in this way of thinking; there are only provisional answers. But students and businessmen, who aren't used to thinking this way, come back with a question about over-analysis. Is it possible? Isn't it a bad thing? Here's my answer:

What about a tendency to overanalyze? Doesn't any push to think critically mean that people will be prone to overanalyze things? Won't people be more likely to overanalyze if they are pushed to think critically about the issues? And is this a good thing?

My position is that you can't really overanalyze a thing. If you use analysis, the issue is the strength of the evidence and the strength of the conclusions based on that evidence, and nothing else. When the term over-analysis is used, people really mean something like "second-guessing." This is really a lack of confidence in the analysis that has been made, which can be a problem, and it is a bigger problem given the argument I have made about the way what counts as a fact changes. If facts have a tendency to change, which they do, how then can I be confident that the conclusion I draw will hold up?

The way around this is to say that it holds up now if it is based on the best possible evidence and analysis. That it might not hold up later is simply an invitation to look for information that falsifies the position you have taken now. Companies should do that anyway, and people like Tom Peters have been shouting at the top of their lungs for years for companies to do this. But what actually happens is that a position is taken, hardens into something close to a fundamental truth about the nature of the universe, and is only revisited when there is a downturn in business (or a defeat on the battlefield). Not a good way to manage.

A good thinker, however, will be looking for the information that will contradict the position already taken. When that information is found, a new conclusion can be drawn based on it and a course correction made. But isn't that new information subject to the same defect, that it might later be contradicted by something else? Yes, but that simply means that the thinker must be thinking all the time and be constantly on the lookout for better information. That makes critical thinking an endless effort. And that is one reason why people don't want to do it.

Part II

Monday, May 01, 2006

What do we know--and when did we know it?

When someone tries to prove something to me and I don't agree, I always ask, "Were you there? Was I there?" If we were not there to experience the facts, then I suppose it is simply an assumption.

You could take that skepticism a bit further, couldn't you? Will the sun come up tomorrow? If you say yes, and your criterion for knowing something is that you were there and experienced it yourself, then how could you say that it will? You aren't there in the future right now to be able to make that statement, are you? (And if you say that the past is the key and that you were there for past instances of the sun coming up, how does a thing having happened in the past necessarily mean that it will happen again? If a chicken is fed every day at a poultry farm, wouldn't its expectation be that the very next day it will be fed again? That next day just might be the day it is dressed out for meat. This is the induction problem that Hume identified. And, by the way, how many times have you actually seen the sun come up?) If I were to ask you whether the sun will come up tomorrow in Ukraine, could you tell me it will based on your experience? Could you even say that I am in Ukraine, or that there is a Ukraine at all?

This is the problem not only for history but also for just about every piece of knowledge that we say we know. Was the atom split? Do you really know? Have you ever seen one split? How could you tell an atom had been split even if you were there to experience it? And if you see the mushroom cloud from an atomic explosion, an explosion, by the way, which not many have seen in person, can you be sure that it is because of the splitting of the atom? Aren't you taking people's word for that? The same thing can be said about: anatomy (how many have ever seen a human heart in person? pictures don't count because they can be falsified); geography (how do you know that there is such a thing as a France or a Russia, even if you are there on the ground?); the birth of babies you haven't seen yourself; illness ("That cold is caused by a virus," says the doctor. How do you know? Have you ever seen a virus? Does a microscope count? Is it a direct experience? Isn't there an assumption that the microscope actually lets you see microscopically small things? Do you know that is true from your experience? And even if you have seen a virus, how do you know that the cold is caused by that virus, or by any virus?); political history (how do you know that George Washington defeated the British, that there was a Revolutionary War in the first place, or that there was even a "British" or a George Washington? His home is there with his pictures in it, but how do you know that it was really his home? How do you know there was a Constitutional Convention, or that there was even a signing of the Declaration of Independence at all? If we have a document, does that prove that it was in fact signed as is purported to have happened?); psychology ("The brain is the seat of the mind." Have you ever seen a brain, in person that is? Pictures can be falsified, and if you see a brain without having seen it exposed by cutting into a person's skull, how do you know that it in fact came from a skull?); love (how do you know that your husband or wife loves you? You can't get into their mind to know, can you?); or any other thing that we do not know from firsthand experience, which is just about everything we know.

The point is that we have to rely on others and, to some extent, on the honesty of others for the very knowledge that we have. If we had to rely on firsthand experience for it, that knowledge would be severely limited.

All of the information that you learned in school, for example, is information that you yourself have not verified or experienced firsthand. All of it. (If you say, "the same thing happened to me at work that I learned about in class," is that the same as being able to generalize about it? The knowledge you learned is generalized and generalizable to most other situations. If you weren't there for those other situations, then you can't claim firsthand knowledge of them.)

If that sounds a lot like a sort of faith, guess what? It is impossible to be an absolute skeptic and still learn. You must have faith in someone's abilities, someone's knowledge, or someone's truthfulness to start learning in the first place. Every discipline, including science, requires the beginner to suspend skepticism and to accept things because "they just are" in order to begin learning. In my critical thinking class, I take the position that we are a little too believing in our school experience, believing too much in the absolute nature of our knowledge, our disciplines, and our instructors, for our own good, because knowledge tends to change quite a bit even in the sciences. But the fact is that it is believing nonetheless.