
Letter to my Kids Part 1

Equality has invited me to post a letter I've been working on to my kids regarding the LDS church and my views on it. This letter was begun to help codify my thoughts, not just regarding the doctrines and history of the church, but also how we think and why we accept and believe facts and propositions about our world. I intend to give this to my children when I judge them to be at an age when they can understand what I'm talking about and hopefully make better decisions because of it, probably sometime later in their teenage years.

For some background,  my wife is raising my children to be faithful LDS.  A few years back, after significant research, investigation, and soul-searching, I came to the conclusion that the church was not what it claimed to be and now no longer participate.  After some difficult times, my wife and I have a mostly stable relationship and she continues to take the children to church, and I generally support her in that.  While I don't necessarily view the kids being LDS as a "bad" thing, I do want them at some point to be made aware of many issues about the church that I never knew growing up that really do affect the way you view the church, and perhaps life in general.  Hence, this letter.  Enjoy!

(Part 1)

Dear <Daughter or Son>,

I'm giving this letter to you because you are now old enough and mature enough to hear about some things that I feel are very important. Yes, this involves religion, but more generally, I want to talk about the philosophy of how we understand our world around us. This is important because the things we believe and our general world view affect our decision making, our attitudes about other people, and how we interact with them and the world. First of all, let me tell you that I love you very much, and that is the reason I am writing to you now. It's taken great effort to put this together, but I feel it is worth it. I want you to be as fully informed about the world as you can be. Second, my purpose in writing this is not to de-convert you from Mormonism or any other belief system. I want you to do what will bring you long-term happiness. I truly mean that. However, I feel it is very important that you do whatever you do with your eyes open and with a full set of facts. It is very unpleasant to find out some uncomfortable truths after you have made major life decisions based on incomplete information. Finally, I would like you to have an understanding of why I have come to the conclusions I have about religion, whether you choose to agree with them or not.

Given this, there are two main areas I wish to talk about. One is how we "know" things. Where does our information come from? How and why do we decide to accept some things as "facts" and disregard others? As Daniel Patrick Moynihan (former US Senator) has said, "Everyone is entitled to their own opinions, but they are not entitled to their own facts." The second area I want to cover is some facts and issues about Mormonism, its history, its early members and leaders, and current organization that you may not have heard about in church or Seminary. I feel it is important that you know about these things because they can be rather surprising or even disturbing to those who have never heard it before. I will give you the best information that I've been able to discover in my research, while doing my best not to tell you how you should feel about or interpret it. You should know that there are many faithful Mormons that know a great deal about the history of the church, including its "warts," and still choose to believe and practice the Mormon faith. It is certainly not automatic that one will leave the church once one knows about the issues. But most importantly, all truth should stand up to scrutiny, and one should never be afraid to ask questions and examine new information. I firmly believe that people should be given a full set of information and allowed to make up their own minds. In business, this is called "full disclosure." In government, it's called "open process." I wish to do the same, if possible, for the LDS church.

This will be rather lengthy because it's not something that can be described in a few minutes. We need to lay some groundwork before we get into the details. Take your time, but I ask that you please hear me out. I am more than willing to talk in person about any of this if you would like.

Knowledge and the Brain

Epistemology is a branch of philosophy that tries to answer the question "how do we 'know' things?" Why do we accept some pieces of information and reject others? Why are some people credible and others not? How come we often react stubbornly to new ideas? There has been a lot of research done on this, and some fascinating patterns have been discovered about human behavior.

The brain is a magnificent instrument - probably the most complex machine that we know of. While we have only scratched the surface in understanding how it works, cognitive scientists and psychologists have learned a great deal about how our brains operate and how people think and learn. Our brains are the result of millions of years of evolutionary adaptation to survive in a hostile environment. Many of our emotions are remnants of behaviors that worked well in that milieu, but may be detrimental or just confusing now. Understanding this can give us an advantage in understanding ourselves and in being able to interact better with others. For example, anger is a great motivator to action, but it can also hinder rational thought and ruin personal relationships. There are a number of thought patterns/behaviors that cause us to make mistakes about what we think we "know." I'll cover a few of them in the sections below.

Confirmation Bias

One psychological characteristic that has been extensively studied is called the "confirmation bias." This is essentially where people are reluctant to change their mind about things, or are not open to new information. The first idea that one thinks of is the one that tends to stick. This has been borne out in many experiments with a variety of people. Confirmation bias is actually useful day-to-day because we don't have time to be constantly evaluating every bit of information that comes along and making up our minds over and over. That would be rather inefficient. But it also means we are often slow to change our minds when confronted with new information.

One cause of confirmation bias can be explained by the work of V.S. Ramachandran, a cognitive neuroscientist. He has found through studying damaged brains and brain scanning techniques that there are a couple of major centers in our brain responsible for building our "world view." One gathers and processes new information, and the other contains a "model" of the world view, including our beliefs about it. Between the two is a "filter" of sorts. It appears to be the job of this filter to screen information, or even ignore new input from the "gathering" center. This makes us more efficient, because our environment usually doesn't change in a substantial way, so why waste time rebuilding the model? But it can also prevent us from accepting new things that are relevant if we're not aware of our bias. I bring this up only to help you be more self-aware: there is "wiring" and structure in our brains that tends to make us behave in particular ways, and being consciously aware of it can help us use it to our benefit. Confirmation bias plays an important role in our acceptance and rejection of new ideas and information. So when confronted by new and/or contradictory information, it pays to ask oneself "Why is this information disturbing? Is it the information itself, or the fact that it contradicts what I thought I knew?"

I read an interesting/funny quote from Scott Adams, the cartoonist for the "Dilbert" comic strip. He said: "As I’ve mentioned in this blog, when people associate with a point of view, they begin to lose objectivity. For example, if you were President of the Unicorn Association of America, and spent your days explaining how wonderful unicorns are, you would become married to that viewpoint. If 400 peer-reviewed scientific articles suddenly appeared indicating that all unicorns are pedophiles, you would be unable to accept that evidence. That’s how normal human brains work, i.e., crappily." This is a restatement of the confirmation bias. People become more invested in a point of view when they publicly espouse that position and spend time and effort endorsing it. This makes it harder to change your mind, even subconsciously.

Evaluating and Accepting Information

One of the best tools now known for evaluating ideas and information is called the "Scientific Method." You've already learned about this in school, but I want to be sure we're on the same page. While there is some argument, even among scientists, about the details of exactly what this means, in essence the scientific method is to (1) create a hypothesis, (2) perform experiments with measurable results that either confirm or disconfirm the hypothesis, then (3) evaluate the results and decide whether the hypothesis was accurate. A couple of attributes distinguish a "scientific" experiment from a "non-scientific" one. First, experimental results must be measurable and repeatable. Second, the results should be such that a "disinterested" party may perform the experiment and get the same result. The state of mind (hopeful or skeptical) of the experimenter should have no impact on its outcome. This method works quite well when working with the physical world and achieving results that are considered "scientifically proven." Even so, nothing is ever proven beyond any doubt - successful experiments only add credibility to a hypothesis over time. How do you know when to stop testing a hypothesis? Even if you have 1000 confirmations in a row, what do you do if experiment number 1001 shows a different result? Because of this, experiments have to be very carefully designed, and hypotheses gain credibility and acceptance over time and repetition. Having experiments repeated and verified by disinterested third parties gives especially strong support for theories. When disconfirming evidence is found for a theory that was previously considered well confirmed, it usually takes time to understand what has changed and why, and sometimes even longer to change scientists' minds. The experimenters themselves are still human and suffer from the same confirmation bias as everyone else. Confirmation bias can be especially pronounced when reputations and professional esteem are on the line.

Occam's Razor

When it comes to things like reviewing historical events, it is difficult to apply the scientific method. In courts of law, different "standards of evidence" are used to evaluate information gathered about events after the fact. Historians don't have the same kind of rules as courts, but they do evaluate testimonies from different sources by comparing the things that match or don't match. They look at a person's vested interest in the event under consideration, and whether their testimony matches other independent sources. From considering a variety of information, a pattern emerges from which a "most likely scenario" can be discovered. Oftentimes, historians (and scientists, too) use something called "Occam's Razor," otherwise known as "The Law of Parsimony." This law states: given multiple theories about an event that are equally supported by the evidence, the simplest should be accepted as the most likely.

Here is a contrived example to hopefully illustrate this point. A friend comes to visit and have lunch and brings along his Labrador dog "Buddy" with him. Your friend thinks Buddy is the greatest, most well-behaved dog in the world, and he is indeed a nice dog. You make lunch of a couple of hamburgers and set them out on the counter. Before you sit down to eat, your friend asks you to come outside and see his new car, and you go out, leaving Buddy sitting in the kitchen. When you return, one of the hamburgers is gone. Buddy sits looking as innocent as ever and you wonder what happened. You immediately assume that Buddy has helped himself to one of the burgers, and you're a little irritated that you have to make another one so you can have lunch. However, your friend insists that Buddy would NEVER do such a thing; he's a well-behaved dog and has never done that in his home. Someone else must have come and taken the hamburger. Now you look for evidence. No one else is home. Your friend says the back door is unlocked; someone must have come in and taken it. Look, there's your neighbor mowing his lawn next door - he could have done it! Both theories are supported to some extent by the existing evidence. Which would you choose? The parsimonious theory is that Buddy ate it; rather obvious, isn't it? But your friend may become angry and take Buddy and leave, unwilling to believe his dog was responsible.

While this may seem like a silly example, it demonstrates a parallel I have seen when people are debating historical situations and the people involved. From one point of view, it may be just so obvious that so-and-so did something, but someone else will defend person X to the extreme. If they could get over their confirmation bias that person X would never do such a thing, it would probably be obvious to them, too. But sometimes they just can't, and "obvious" to one person is anything but obvious to someone else. Depending on their emotional investment in the position, it just may be too embarrassing (they've publicly stated the other position numerous times), or personally painful ("person X was my great-great grandfather and a respectable man!") to publicly admit they were wrong. I don't "blame" them necessarily, but it does change how I accept their evaluation of the evidence.

When evaluating conflicting evidence, especially when it comes from eyewitness testimony, we need to sort out what is "possible" from what is "likely." There may be a million ways something COULD have happened (maybe an alien spaceship teleported the hamburger out of your kitchen), but what is the most likely? And the more coincidentally improbable things that have to happen for something to be true, the lower the likelihood that it is true.

Can We Be "Certain?"

Another thing to consider is certainty vs. probability. When defending his dog, the friend might insist on "proof" that the dog ate the hamburger. Without pumping the dog's stomach, there is no way to prove beyond any doubt that the dog ate it. (Even upon examining the stomach contents, the friend might argue that he fed his dog some meat earlier, and that is what was found in his stomach.) An argument can always be made to defend a position; we need to consider the plausibility of the argument vs. the alternatives. Even the most accepted scientific principles, like gravity, are not "proven" with absolute certainty. Perhaps there are some situations in which gravity doesn't apply, such as in the sub-atomic quantum-mechanical realm. Sometimes we have to accept some degree of uncertainty and carry on with what we have. Uncertainty is a part of life, but it doesn't have to paralyze us. The more outlandish the claim, the more evidence should be required to support it - as Carl Sagan said, "extraordinary claims require extraordinary evidence." It shouldn't be our job to "prove" unreasonable claims untrue beyond any doubt. It should be up to the proponent to demonstrate with reasonable probability that their claim is in fact correct.

Why Do People Believe Wrong (or Weird) Things?

There are a number of reasons that we believe things that are not objectively true. In fact, I'm quite sure that everybody believes a number of things that are incorrect, and are not at all aware that they do. Some people are convinced they have been abducted and experimented upon by aliens. Others believe that crystals can be used to heal people of serious diseases. Still others believe Harry Potter-style magic actually exists. All of these beliefs are significantly outside the mainstream, but some people accept them without question. Why is that? One reason is that we simply accepted what we were told at an age before we really had the cognitive faculties to analyze the information. Since then we've had no reason to re-evaluate that initial proposition. Other times we make a mistake in attributing false causes to desirable outcomes. For example, many superstitions are based on false-cause fallacies, such as baseball players insisting on wearing their "lucky socks" on game day. Others are more subtle, and not so easily proven or disproven. Sometimes we can even be almost willful in our denial of new information. This can be the result of not wanting to believe new information because it is emotionally painful for us. The discomfort of trying to hold two contradictory ideas at once is called "cognitive dissonance."

A classic example of cognitive dissonance is a husband or wife confronted with information that their spouse is cheating on them. The wife, let's say, has to deal with two cognitions that are mutually exclusive: (1) her husband is a wonderful, faithful family man who has never given her reason to question his fidelity, and (2) her friend's report of seeing him leave a hotel with another woman and passionately kissing her goodbye. These two ideas are completely at odds, and depending on the emotional pain she might experience by acknowledging #2 above, it may be rejected (even subconsciously) or suppressed. She may be very good at making up excuses about how such a thing could be possible. She may even become angry with her friend for making such an accusation. Maybe her friend was mistaken about his identity. Maybe he was at a business meeting, and it was just a peck on the cheek as many Europeans do when greeting friends. And so forth. Anything but face the awful truth of her situation, while her friend has no trouble recognizing the reality. Cognitive dissonance is the discomfort resulting from the clash of two contradictory ideas, and suppression or rejection of one of them is usually the result. It's hard for people to live with two fully contradictory ideas, especially if it impacts them in a personal way.

You may wonder if the spouse in the situation above was just stupid. Probably not. In fact, there are many examples of very smart people who believe ideas that are far outside the mainstream, even "kooky." But it's been shown that even "smart" people often arrive at some subset of their beliefs for "non-smart" reasons (or at a young age) and are now quite good at defending that set of beliefs with their new set of intellectual capabilities. See "Why People Believe Weird Things" by Michael Shermer for more interesting examples. And the stronger the external pressures to believe (such as loss of employment, or disruption of family life), the harder people will cling to their beliefs.




This is great stuff, fh451. Thanks for posting it here. I look forward to the future installments. I have been thinking a lot about the idea put forth by a Mormon apologist who has posted comments here to the effect that the only difference between a faithful Mormon who knows the history and a disaffected Mormon is that they are simply making different inferences about the facts and evidence. Of course, that is true, but the question is whether some inferences are more reasonable than others. Your example of how the hypothetical woman deals with the information about her cheating husband is a good one. She is making inferences, giving her husband the benefit of the doubt in every case. But while that may be reasonable with one piece of evidence (e.g., "I saw your husband at a motel with another woman"), and may remain reasonable as additional facts come to light (e.g., "they were in the lobby. The woman looked like his sister. She had two kids that look just like your nephews."), it becomes less reasonable depending on other facts that come to light (e.g., "he was kissing a woman passionately as they left the room together.") In the latter case, the wife's decision to give her husband the benefit of the doubt (to apply positive rather than negative inferences to the facts presented) is less reasonable. At some point, giving her husband the benefit of the doubt by always drawing a positive inference becomes completely unreasonable (i.e., the other woman keeps calling; there is lipstick on his collar; unexplained time away from home; pictures of the other woman in his wallet; a DNA test showing that he has fathered a child by this other woman, etc.).

That's really the case with Joseph Smith. Sure you can give him the benefit of the doubt on some things and draw positive inferences on some things. But as the evidence piles up, it becomes extremely unreasonable to give him the benefit of the doubt on everything. At some point, the evidence ought to break through the dissonance. That it doesn't always is a testament to the power of religious belief.


It's a well-written, well-conceived letter. I think that anyone about to read Thornton Wilder's "The Bridge of San Luis Rey" should read it. The main point of the letter deals with applying scientific reasoning to religious belief. Anytime someone starts talking about that, I can't help but think of poor Brother Juniper and his 'experiments.'

I do find it interesting, however, that under the portion labeled knowledge you omitted any definition of what philosophers consider knowledge. The consensus view is that knowledge is justified true belief. The physiological mechanism of obtaining knowledge, however advanced it may be, never tells us what knowledge actually is.

I also find it interesting that you employ the law of parsimony in evaluating religious beliefs. The problem with it is that it does not allow for spiritual or mystical experiences to be assigned any meaning or importance. I think you would be hard-pressed to find anyone that would apply it so cynically to mystical experiences. Besides, people in real life seem to be unconvinced by it. (See the O.J. Simpson criminal trial and acquittal; although to be fair, the LAPD did hurt the prosecution by being sloppy in their treatment of the crime scene.) It may have its uses in certain circumstances, but I think there is no basis to use it for every conceivable circumstance.

As far as the adultery and prophet analogy is concerned in Equality's post, I think it overlooks the importance of defining terms before looking at the evidence. To wit, it's easy to define what an adulterer is, but not so easy to define what a prophet is. I think that we can all agree that an adulterer is a person who is married to someone and has sexual relations with a person who is not his/her legally-married spouse. For the wife in the analogy to experience cognitive dissonance, she would have to find that the evidence pointed to her husband matching the description of an adulterer above. The greater the evidence, the more likely her husband is an adulterer. And Equality's point that you can only give the benefit of the doubt up to a certain level is correct.

But what if we're talking about prophets? If you go out onto the street and stop ten random individuals, ten out of ten of them would most likely agree to the definition of an adulterer above. If you asked those same ten people, "What is a prophet?", I very much doubt that you would have ten substantially similar definitions. What is the consensus view of what makes a person a prophet? Considering the vast differences in religious belief concerning prophets, is there a neutral, independent-of-a-given-religion definition? This leaves us in a quandary because the definition is the basis on which we identify the facts that are material or relevant to the determination of a prophet. In a free country like ours, there are no universal, exterior definitions of a religious prophet. It comes down to a personal choice and what I define as a prophet defines what evidence I will consider material in making a determination as to whether a particular person fits that mold. It may be 'kooky', 'creepy', 'mysterious' and 'spooky', but only because of public opinion and for no other reason. It is fallacious to argue that I am wrong simply because a majority of people think otherwise.

There can be no scientific definition of a prophet. It is a purely religious thing. Religious belief entails that a choice of some kind has been made. And that choice determines which inferences are reasonable or credible. If people don't agree with what you define as a prophet, they can't say that you are being stubborn or irrational; you just don't have common ground on which to operate because you haven't arrived at a definition of what you are talking about. Absent a consensus definition, any discussion would be meaningless. If I believe that a person is a prophet because they have a tattoo of a skull on their chest, who is to say that I am wrong in my religious beliefs? If I hear about someone who has a skull tattoo and they also happen to be a serial killer, I have a choice. I can change my definition (and I concede that outside influences play a major role in changing a given definition) or I may say, "Well, I never defined a prophet as a perfect person, I only ever said that they have skulls tattooed on their chest. Anything Prophet X does outside of his tattoo is not worthy of my attention and has no bearing on his divine calling." If I did change my beliefs based on someone else's reasoning, I would already have to be somewhat sympathetic to that viewpoint because there would be no common ground otherwise.

Mr. Moynihan was correct when he said that people can't choose the facts, but he omitted to mention that the opinion they choose determines which facts they consider meaningful and important.


I concur that the definition of a prophet is not as objective or standard as the definition for adulterer is. In this case I believe we are safe in holding Joseph Smith to his own claims, whether he be called prophet, seer, or con man. He claimed to have had a vision. He claimed to have been threatened by an angel with a sword if he did not conform to the practice of polygamy. He claimed that his followers would reap huge benefits if they invested in his "revealed" banking institution in Kirtland. He claimed to have translated ancient papyri written by Abraham.

So whatever definition you choose, he can be held to those claims.


Thanks, DPC, for your thoughtful comments. In saying that it's "interesting" that I omitted any mention of a definition of what philosophers consider to be knowledge, are you implying I did that on purpose to somehow avoid something that might undermine my thesis? Just asking, because I don't think I'm really capable of that kind of sophistry, frankly :-). First, I don't feel like I'm an expert in the field of epistemology, and it is true that the question of "what do we really _know_ anyway?" is difficult and messy. I doubt I could really do it justice. Second, I didn't think it would add much to what I was trying to say to my target audience - but you did a good job supplying a succinct statement. Of course, one could argue (and many have, for thousands of years) exactly what one means by "true" and "justified" (or even "belief," for that matter :-)). In general, I think most people (especially teenagers) have some intuitive feel for what it is to "know" something, and I don't know if it's worthwhile to belabor the point at a philosophical level.

RAS already gave, IMO, a good response regarding the "prophet" definition. I really don't much care what the "man on the street" would say a prophet is; I'm really only interested in the LDS definition of prophet as I was taught growing up, and the version my children have likely been and are being taught. The expectation that I have of a theology is that it should at least be internally consistent, even if it is not supported by empirical evidence in the scientific sense. The LDS idea that the powers of heaven are irrevocably tied to personal worthiness is in direct conflict with one who claims to receive prophetic revelation while at the same time behaving in a way that would, by the LDS definition, make one ineligible for such. Thus, my investigation into the behavior of Joseph Smith led to an inescapable (for me) conclusion: he could not be a prophet by his own definition.

The "law of parsimony" is only a law because we call it such; in reality, it is an idea that has been found to be useful in a variety of situations. There are circumstances in which the more complicated explanation for a phenomenon does, in fact, turn out to be correct. But it is only judged so in the light of additional experimentation or information that then makes the more complex explanation better match the evidence. All things being equal (which they rarely are), I would still favor the simpler explanation, even for mystical experiences. I'm not convinced by the O.J. Simpson analogy. The fact that the jury came to an erroneous conclusion in no way impugns the value of Occam's Razor. In fact, I would say that had they actually used Occam's Razor appropriately, justice would have been better served. But that is a completely separate issue from mystical/spiritual experience. In my opinion, one has to accept the existence of a spiritual or mystical realm ex nihilo as axiomatic, and proceed from there. Those who accept it simply do, and don't seem persuaded otherwise by the vagaries and sometimes contradictory conclusions that often result. If one desires to believe, then it is a simple matter to come up with a satisfactory (for them) explanation. In a subsequent installment I will give a few more examples of debating techniques that are often used in such situations, and what I consider to be the difference between faith and knowledge.


Eric Robeck

Excellent letter. I will probably be writing a similar letter someday, as I find myself in the same situation.

Thanks for your thoughts.
