Tuesday, November 30, 2010

The 48 Percent Part 3 - Slipping Into the Future

Right now, the United States economy, and that of the world in general, is in turmoil, arguably the worst economic disaster since 1929. In my day job, working on the public side of the workforce development arena (i.e. Job Service), I see many people who feel strangely out of place seeking new employment in the 21st century world, many of whom were with their former employer for decades.

Even my co-workers are not immune to this sense of bewilderment. The particular branch of my state government for which I work is in the midst of transitioning, after nearly a decade using Windows XP® and MS Office XP/2003, to Windows 7/Office 2010. In speaking with my managers and co-workers, I have likened the experience of my co-workers starting up their computers one morning, only to be faced with a completely unfamiliar operating system, to what a blind person would likely experience if they awoke one morning to find their furniture had been rearranged while they slept.

One of my younger co-workers observed, as have I, that many job-seekers who claim to know nothing about computers act as though their ignorance is something of which they can be proud. I am quite willing to grant a pass to those of my parents’ generation (born 1939), those who are at or near retirement age. However, for those young enough to have not yet started elementary school when Sputnik 1 launched in 1957 (i.e. those born after, say, 1952 or 1953), there is, barring some notable exceptions that I will address shortly, no excuse.

My day-to-day supervisor thinks that, at the rate computer technology is advancing, it will not be long before computers are completely voice-activated, without the need for keyboards or pointing devices (mice, trackballs, etc.), with perhaps the exception of a finger. The sort of computer-human interaction depicted in Steven Spielberg’s film Minority Report may not be far off the mark in this regard. History has shown, however, that while the pace of technological change is in some regards even quicker than some “futurists” predicted, reliably predicting the direction of the changes is more of an art than a science. Case in point: the real 21st century does not look much like the one most people, from the vantage point of the 1950s through the mid-1970s, thought it would.

While the changes that have occurred have not been in the anticipated directions, they are every bit as momentous as, if not more momentous than, the long-delayed flying cars in everyone’s garage, orbiting space colonies, and manned Mars bases. Just because we do not yet have flying cars for all, and are still waiting on the lunar vacation resorts, does not mean that the computer revolution of the last 15-20 years was a mere passing fad, as some plainly thought it was. Apparently, a significant number of our fellow citizens (I am almost exclusively concerned with Americans in this piece) were not paying attention.

Another thing that leaves me stupefied is when people act outraged or offended when it is pointed out that their lack of computer skills will preclude their opening certain occupational doors, as though it is completely unreasonable to expect them to put forth the required effort to acquire new skills in order to make themselves marketable to employers. I am not talking about “average” people knowing how to create their own webpage using HTML, or build a relational database from scratch. What I am talking about is “average” folks being able to follow the instructions for filling out an online employment application or finding information about a potential employer by visiting their website.

It is now time for a few caveats and to crank down the “arrogant bastard” tone. Many people (I am referring to adults throughout) find computers intimidating. One oft-repeated frustration I have heard is that some find computer screens too busy, with just too many things to track visually. This produces stress and anxiety, which in turn makes it that much harder to focus, and the whole thing becomes a debilitating positive feedback loop. Our technological society has created a stimulation-rich environment that did not exist 50+ years ago.

One result of our stimulation and information-rich society has been a marked increase in the identification of, and diagnoses for, cognitive impairments like ADD/ADHD and autism. One mundane reason for this increase is that the diagnostic tools used to detect such impairments have become much more refined as our understanding of how the brain works has grown. The other reasons are best explicated by way of analogy. Brain circuits involved in spoken language were certainly shaped by our evolution, but written language (and formalized mathematics as well) are technological innovations, not things for which the human brain was equipped by natural selection.

There remain a few isolated hunter-gatherer societies in the world today, living much the same way that all of humanity lived prior to the invention of settled agriculture and the first cities. Even after the invention of farming and cities, several more thousand years elapsed before writing was invented. In such pre-literate societies, whether today or in the depths of human pre-history, the condition known today as dyslexia was irrelevant in the environment in which such people lived. Additionally, to the extent that “dyslexia” was irrelevant, it would be reasonable to say that, in some sense, “dyslexia” did not exist.

A quote, most often attributed to William James, but first made known to this author via Mr. Spock of Star Trek fame, summarizes the situation nicely: “a difference which makes no difference is no difference.” If a difference in brain wiring, known to lead to dyslexia in an environment in which mass literacy is the norm, were to occur in the brain of a person belonging to a pre-literate society, that difference in brain wiring would, in fact, “make no difference” in the life of the individual in their native culture. It is only in an environment where mass literacy is the norm that such a difference in brain wiring makes an important difference.

People today find themselves bombarded by external stimuli (mostly visual and auditory) that were not part of the environment in which Homo sapiens evolved. Navigating the modern world calls upon cognitive abilities that were seldom, if ever, called on just a generation or two ago. Is it any surprise to anyone, especially experts in the cognitive sciences, that the totally novel sensory and cognitive environment of the early 21st century is revealing heretofore unrecognized, and previously unneeded, strengths and weaknesses in the cognitive endowments of individuals?

One of the names this sort of “sensory overload” goes by is Sensory Processing Disorder (SPD), but there are other conditions that exhibit similar symptoms. Much of the relevant peer-reviewed literature available to this author has titles such as:

  • Sensory processing disorder: Any of a nurse practitioner's business?[i]
  • The Concept of Information Overload: A Review of Literature from Organization Science, Accounting, Marketing, MIS, and Related Disciplines[ii]
  • "Is this Site Confusing or Interesting?" A Perceived Web site Complexity (PWC) Scale for Assessing Consumer Internet Interactivity[iii]
  • Employment and Adults with Asperger Syndrome[iv]
  • Information-processing deficits and cognitive dysfunction in panic disorder[v]
  • Perspectives on sensory processing disorder: a call for translational research[vi]
  • The Impact of Induced Stress Upon Selective Attention in Multiple Object Tracking[vii]

ADD/ADHD provides a useful analog for this in that it was first identified in children, and only later was it realized that the condition persists into adulthood. Many adults, myself included, developed coping strategies and were completely unaware that the ADD/ADHD they had as a child was still with them until they were properly tested. It may be that because children are often placed into environments and situations they would rather not be in, it is in these situations that their struggles stick out like the proverbial sore thumb and provide a “handle” on which to begin a scientific inquiry. Adults, on the other hand, have traditionally had far more control over what situations and environments they choose to be in, and after many years of exercising that control, the avoidance often becomes unconscious. With the majority of research in this area focused on children, I fear that many adults are slipping through the cracks. Adults who, for whatever reason, avoid anything that is too intellectually or cognitively challenging will be at a disadvantage, both in terms of self-sufficiency and providing good role models for their children.

I suspect that the apparent difficulties many adults have in functioning in our current stimulation and information-rich environments are an admixture of genetic/physiological[viii] factors, socioeconomic/cultural factors, and personal choices. Groping for remedies to this situation, without having the vaguest notion of how much of a particular population’s variation in demonstrated cognitive capacity is attributable to which causal factors, is a recipe for failure. The opposite extreme, a “wait it out” attitude, essentially admits that some sizable fraction of human beings is somehow incapable of being educated or elevated in their competency and understanding. So why not give up right now?

I do not wish to live in a world where a certain percentage of humanity is relegated to a permanent intellectual underclass, not if it can be helped. Putting my “arrogant bastard” hat on again, I also know that many people are, in fact, ignorant, incurious, and complacent. A frequent frustration in my day job is that I am in a bit of a catch-22 in distinguishing those who have a legitimate learning/cognitive disability from those who are merely lazy and/or complacent. Either way, I risk doing a grave disservice to those I work with daily. I do not suffer fools gladly, and far too often, I must bite my tongue and refrain from telling someone who just could not be bothered to keep up with our changing world that, unless they have a note from their doctor, I am not going to carry their lazy ass.



[i] Byrne, Mary W. "Sensory processing disorder: Any of a nurse practitioner's business?" Journal of the American Academy of Nurse Practitioners 21. 6 (2009): 314-21.

[ii] Eppler, Martin, and Jeanne Mengis. "The Concept of Information Overload: A Review of Literature from Organization Science, Accounting, Marketing, MIS, and Related Disciplines." Information Society 20. 5 (2004): 325-44.

[iii] Gupta, Reetika, Sucheta Nadkarni, and Stephen J. Gould. ""Is this Site Confusing or Interesting?" A Perceived Web site Complexity (PWC) Scale for Assessing Consumer Internet Interactivity." Advances in Consumer Research 32. 1 (2005): 42-50.

[iv] Hurlbutt, Karen, and Lynne Chalmers. "Employment and Adults with Asperger Syndrome." Focus on Autism & Other Developmental Disabilities 19. 4 (2004): 215-22.

[v] Ludewig, Stephan, et al. "Information-processing deficits and cognitive dysfunction in panic disorder." Journal of Psychiatry & Neuroscience 30. 1 (2005): 37-43.

[vi] Miller, Lucy J., Darci M. Nielsen, et al. "Perspectives on sensory processing disorder: a call for translational research." Frontiers in Integrative Neuroscience 3 (2009).

[vii] Morelli, Frank, and Pamela A. Burton. "The Impact of Induced Stress Upon Selective Attention in Multiple Object Tracking." Military Psychology 21. 1 (2009): 81-97.

[viii] When I say “physiological,” I have in mind such things as the environment within the womb, nutrition, or any other biological effect that is not directly attributable to an individual’s genotype.

Saturday, September 4, 2010

The 48 Percent Part 2

At Beyond Belief 2006, when speaking about his work on phantom/paralyzed limbs and the denials that can accompany such phenomena, V.S. Ramachandran related a humorous anecdote about a study that asked people if they were above or below average in intelligence. Ramachandran pointed out that, like height, the distribution of IQ scores in a population takes on the shape of the iconic “bell” curve (called by mathematicians a “normal” or “Gaussian” distribution). The salient property of a Gaussian distribution is that it is symmetric, so its mean (the arithmetic average) coincides with its median: 50% of the population will fall below the average value for the trait in question and the other 50% will fall above it.[1] The punch line comes when Ramachandran reveals that 98% of the survey respondents indicated that they considered themselves to be of above-average intelligence, a statistically impossible result which indicates that 48% of humanity is “in denial of their own stupidity.” His point was that even people without brain injury engage in classic Freudian, defensive denials every day.[2] Though the study may have been fictional, it is plain that only 50% of humanity can be of above-average intelligence; the other 50% must fall below that average.
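The arithmetic behind the “48 percent” can be sketched in a few lines of Python. This is a hypothetical simulation, not anything from Ramachandran’s talk: it assumes the conventional IQ scale (mean 100, standard deviation 15) and takes the 98% figure from the anecdote as given.

```python
import random

random.seed(0)

# Simulate IQ scores on the conventional scale: mean 100, std. dev. 15.
scores = [random.gauss(100, 15) for _ in range(100_000)]

# A Gaussian is symmetric, so its mean equals its median: about half
# the simulated population falls below the mean.
below_mean = sum(s < 100 for s in scores) / len(scores)
print(f"fraction below the mean: {below_mean:.3f}")  # close to 0.500

# If 98% claim to be above average, but only 50% can be,
# then at least 48% are mistaken.
claim_above = 0.98      # fraction claiming above-average intelligence
actually_above = 0.50   # true fraction above the mean/median
in_denial = claim_above - actually_above
print(f"fraction 'in denial': {in_denial:.2f}")  # prints 0.48
```

Note that the “half must be below average” argument relies on symmetry: for a symmetric distribution the mean and median coincide. For a skewed distribution, well over half a population really can be above the (mean) average.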


A consequence of the above dilemma showed up in an interview with Lord Martin Rees, Patricia Smith Churchland, and A.C. Grayling conducted by Roger Bingham of The Science Network. At about 00:32:00 into the dialog, Dr. Churchland notes that there is (primarily in the United States), coming from both the far left and the extreme right ends of the cultural/political spectrum, a disturbing undercurrent of anti-intellectualism in general, and of anti-science in particular. She confessed that she does not know how to reach the sort of people who get their news from Rush Limbaugh and/or a certain American news channel that she left unnamed.


The brute fact that half of humanity will always fall above the normalized “average” intelligence (measured by whatever criteria one chooses) and the other half will fall below that “average” poses a profound problem for skeptics, atheists, scientific rationalists, humanists, and anyone hoping to increase the role of evidence-based critical thinking in the discourse of our democratic republic. Proposed solutions usually involve some combination of better schools and/or teachers, more educational television programs, more popularizations by capable scientists or other public intellectuals, more scientifically accurate Hollywood pictures, or better-trained science journalists. However, what if those things are only partial solutions? What if there is an asymptotic limit to the percentage of people that can be reached by reason and evidence? Amidst all the talk of science and math education in schools and efforts to engage the voting public, there is one question that has not been raised, let alone substantively addressed. What if some significant fraction of humanity is simply not cognitively equipped to think critically or rationally to the degree required to become a scientifically literate citizen in the 21st century?


No matter how “politically incorrect” the above question may seem, it is well worth answering. Neither asking the question nor attempting to answer it is part of some sinister eugenics program or elitist, racist agenda. In Breaking the Spell, Daniel Dennett advocated that all the tools in the arsenal of modern science be applied to understanding religious faith and practice. Dennett also maintained that religious faith is unique in that it is currently off limits to the kind of inquiry he was proposing, hence his choice of title. While it is certainly true that the “taboo” against looking too closely at religious beliefs for fear of dulling their sheen is probably the strongest such taboo, inquiries into other areas of human existence can also set off alarms in some people. It is quite likely that what follows, if taken up by experts in the relevant disciplines (I make no claim to be one), may be an important adjunct to the investigation Dennett proposed. If the question under consideration were “what percentage of human beings could reasonably be expected, given the heritability of the required traits and an environment that provides the opportunity to excel, to perform at the level of an Olympic athlete?”, it would be entirely uncontroversial. That is precisely the kind of question posed (but not answered) in this series of essays.


As Steven Pinker argued so effectively in The Blank Slate, human beings are not infinitely malleable. Relevant to the topic of these essays is the question of what exactly goes into making someone a skeptical, critical thinker (though not necessarily a scientist). Obviously, “intelligence,” colloquially called IQ (after Intelligence Quotient) and admittedly a slippery term, is part of the picture, regardless of how “intelligence” is defined and/or measured. The openness and intellectual honesty demanded by rational inquiry are essential not only to science, but to history, law, medicine, ethics, and any other field of human intellectual endeavor (not to mention the functioning of a healthy democracy), and are antithetical to any form of authoritarianism. The degree to which someone fits the authoritarian personality type certainly matters too. What is the nature/nurture split for authoritarianism? Likewise, curiosity and inquisitiveness are also essential to being an informed, rational citizen in the 21st century. However, there are hundreds of millions of human beings in the United States alone, never mind the rest of the planet, who seem incurious and uninquisitive. To what degree are curiosity and inquisitiveness malleable or heritable? What areas of the brain light up when someone is asked to justify thinking that evolution or anthropogenic climate change is a preposterous idea, while at the same time finding millennia-old miracle stories of virgin births, people rising from the dead, or nocturnal rides on flying horses to be completely credible and utterly reliable?

Some of the relevant research in all these areas has been conducted already, with the greatest amount devoted to the heritability of IQ. A limited amount of research has been performed on the heritability of authoritarian attitudes, and very little research has apparently been done on the nature/nurture mix for things like critical thinking, tolerance for ambiguity, or curiosity and inquisitiveness. Nearly all attempts to engage the public to further science and reason seem to assume that a majority of those not already so inclined or engaged can indeed be reached. Those convinced that the future of humanity critically depends on the application of science and reason to the problems that vex this planet would do well to test the assumptions underlying efforts to communicate science and reason in order to better direct their efforts. None of this should suggest in the slightest that if the number of people that can, in principle, be reached is below a certain minimal threshold, the effort is not worth it. Nonetheless, we need to have some idea of how successful we can reasonably expect to be, all other things being equal.



[1] Barring any “self-selection” biases of course.

[2] Ramachandran, Vilayanur S. Roger Bingham ed. Session 4. Beyond Belief: Science, Reason, Religion & Survival. Salk Institute. La Jolla, CA. November 5 2006. The Science Network. 23 August, 2009. (at 44:12).

Thursday, August 26, 2010

The 48 Percent Part 1

I am not a professional, credentialed scientist – I am just a guy with a B.S. in Interdisciplinary Science (IS) with an emphasis on science communication and the public understanding of science. I do consider myself a serious amateur and in that context, I do what I can to be, in Carl Sagan’s memorable phrase, “a candle in the dark,” a voice for reason in our “demon-haunted world.” I deliberately switched majors to IS from Electrical Engineering because I was so deeply concerned about the lack of appreciation and understanding of what science and critical thinking are, even among very bright students in engineering programs.

My undergraduate thesis involved a planned NSF-funded “Deep Underground Science and Engineering Laboratory.”[1] The State of South Dakota, and especially the Governor’s office, made a big deal about how much the planned laboratory could do for science education and attracting high-tech jobs (those involving Science, Technology, Engineering, and Math, or STEM for short) to the state. My thesis looked at these hopes in light of the realities “on the ground,” considering the fact that South Dakota is very conservative, both politically and religiously. The seed of this research was planted by an incident involving the local YMCA. Rather than “reinventing the wheel,” I will quote from my final paper:

“Part of the impetus for this Capstone project came from the opening of The Arts and Science Center at the Rapid City YMCA (YMCA ASC) in 2005. Initially, staff at the YMCA hoped to get students and faculty from the South Dakota School of Mines and Technology (SDSM&T) in Rapid City involved, and the SDSM&T student Paleontology Club was especially enthused about the Center as they were to have a room showcasing dinosaurs. When plans for the room were discussed, though, it turned out that the room was to consist of little more than colorful murals showing dinosaurs and people, together. As it was, a conservative, home-schooling Christian mother was a primary financial benefactor of the Center and refused to have anything showing dinosaurs (or people) in their proper geological and evolutionary context. The student Paleontology Club refused to have anything to do with such an intellectually dishonest enterprise, and the faculty of SDSM&T likewise has had nothing to do with the Arts and "Science" Center since this issue came to light. From firsthand experience, this author also notes that the YMCA ASC also has an astronomy, or “outer space,” room that is completely devoid of any hint of the scale, in both time and space, of the cosmos.”



My research and writing unfolded over an 18-month period. The first task was to convince my readers that there was indeed a problem, so I looked at comparisons between the United States and other first-world nations in science literacy and academic achievement, and as one might surmise, the United States did not fare well. My research also looked at Math and Science Partnerships (MSPs), which are, as the name suggests, joint ventures between K-12 schools, federal and university laboratories, and industry, aimed at creating the workforce the United States will need to compete and prosper in a 21st century world. Not surprisingly, the intended metric for the success of such endeavors was performance on standardized tests.


It was never my intent to answer specific questions, but to pose them. This was the approach that Daniel Dennett took in Breaking the Spell: Religion as a Natural Phenomenon.[2] Asking my readers’ indulgence (hopefully) one last time, I will quote the final paragraph from my paper:

To truly prosper, as a free society and as individuals, it is not enough to merely do well on standardized tests. What is needed are citizens that do not fall for the idea that vaccines cause autism, that do not spend millions, if not billions, of their precious health-care dollars on homeopathic remedies that do not work, and parents that are not so certain of the “power of prayer” as an efficacious treatment for disease that they refuse conventional (i.e. double-blind tested and verified) medical treatment for their sick child. It is quite possible to believe all the things above, and still do well on standardized tests or write sophisticated software for a modern computer. This research, while in no sense conclusive, will hopefully encourage these important issues to be examined in the development of ongoing “Education and Outreach” strategies surrounding the Sanford Lab, and hopefully, the NSF’s DUSEL.


As my research drew to a close, I was bothered by some nagging questions. In my research, I made a point of discussing the probable relevance of the “Five Factor Model” of personality in describing what personality variables may go into fostering “scientific habits of mind.”[3] The personality traits included in the FFM are:

  • Openness to Experience: curious, creative, non-dogmatic
  • Conscientiousness: self-disciplined, seeking to avoid error
  • Extroversion: outgoing, assertive
  • Agreeableness: generous, easygoing
  • Neuroticism: anxious, critical of self and others


These traits are not binary qualities, like whether or not a male is circumcised, but fall on a continuum like height or weight (okay, maybe not weight so much, what with the epidemic of obesity). While the applicability of the FFM to understanding what goes into creating “scientific habits of mind” seems obvious, none of the literature I found gave even a hint of how these traits might be distributed in any population. This is unlike the robust empirical and statistical data showing the distribution of intelligence (or IQ) in a given population. This seems like a good place to stop for now and also makes a nice segue to the next installment.



[1] "Sanford Laboratory at Homestake". South Dakota Science and Technology Authority. Last modified date not given. Accessed 25 August, 2010.

[2] To the best of my recollection, I was not intentionally aping Dennett. While I was aware of his book and knew something of what it was about, being a poor student who had already purchased Richard Dawkins’ The God Delusion in hardback, I had to wait until the paperback version of Breaking the Spell came out. By the time I could afford to buy Dennett’s important book, I had already settled on my research questions and goals for that research.

[3] Shermer, Michael. The Borderlands of Science: Where Sense Meets Nonsense. New York, NY: Oxford University Press, 2001.

Tuesday, May 11, 2010

A principled position on an "atheist/agnostic" for the Supreme Court

Note that the following has become moot for the most recent SCotUS vacancy...

On one of my frequent visits to Richard Dawkins' site I came across a discussion in the News section of an LA Times Op/Ed piece by Marc Cooper whose thesis was that Obama should consider nominating a religious non-believer (actually, Cooper used the word “atheist”) to the Supreme Court of the United States (SCotUS). While it would be nice to have a person of no faith as a member of the SCotUS, actively advocating for an atheist/agnostic as a nominee would constitute a transgression of Article VI, § III of the United States Constitution, which clearly says:

The Senators and Representatives before mentioned, and the Members of the several State Legislatures, and all executive and judicial Officers, both of the United States and of the several States, shall be bound by Oath or Affirmation, to support this Constitution; but no religious test shall ever be required as a qualification to any office or public trust under the United States.

I am fully aware that the religious lobby has been doing a slimy end-run around the “no religious test” clause for decades by using the fears and prejudices of their voting constituencies to subject candidates to de facto (i.e. “unofficial”) religious tests. However much fun it would be to turn the tables on believers by having a religious test of our own (by "our own" I mean the skeptical/freethinking/reality-based community, and this includes even the most "wet behind the ears" atheists), in all intellectual honesty, the only correct answer to the question of what sort of "religious test" we ought to make up is “none.” Personally, I would rather stake out the principled Constitutional high ground on this issue. Further, empirical evidence supports the conclusion that religious believers would be utterly blind to the hypocrisy of their own position: in their distorted world view, any public statement that does not constitute "thunderous applause" for their position is callous hostility, and they seem not to recognize a "yeah, whatever" (insert whiny, nasal sound when you say "whatever") as a dismissal indicating that what was said is recognized as irrelevant.


I think Richard Dawkins’ phrase “consciousness raising” is particularly applicable in this case. I would like to see voters, candidates, and the media become reflexively suspicious of candidates that trot out their piety in order to win votes. Likewise, I would like to see fellow citizens, candidates, and the media heap scorn and derision upon voters (whether singly or in groups) that attempt to ascertain the religious opinions of candidates. I would love to see the day when, in reply to a question put to a candidate about their religious sentiments, they answer with something like this: “The Framers of the Constitution knew well what kind of damage could be done to a country by inflaming religious prejudices and hatreds among its citizens. This is why Article VI, § III of the United States Constitution specifically forbids religious tests for any 'office or public trust.' What the framers likely did not anticipate was the extent to which citizens could be manipulated into creating an 'unofficial' test by unprincipled demagogues who care not a whit for the Constitution. Merely asking a candidate or nominee for public office 'are you religious?' is a "religious test" and is antithetical to the ideals and values upon which our Constitution is based, and is about the most un-patriotic, un-American thing I can imagine.”


I must confess to being less than sanguine about the possibility of such a day coming to pass in my lifetime (one cannot say that until one is over 45, by the way). But as Alexander Pope wrote in 1733 in his Essay on Man, indeed, "hope springs eternal."

Saturday, April 17, 2010

PZ Myers, Phil Plait, and the Pope

Two bloggers that I greatly admire, Phil Plait of Bad Astronomy and PZ Myers of Pharyngula, disagree over just what atheists/skeptics/non-believers/rationalists/free-thinkers (I will be using these somewhat interchangeably throughout the remainder of this piece) ought to be advocating with regards to the revelation (pun intended) that Pope Benedict XVI, while still the (presumably fallible...you can be sure that Bill Donohue will point that out before long) Cardinal Ratzinger, knew of, and was complicit in, the shuffling around of Catholic priests against whom substantive accusations of child sexual abuse had been leveled, in what plainly was, and is, an ongoing attempt to shield those priests (and those above them in the hierarchy of the Roman Catholic Church) from investigations by temporal and civil authorities.

There is even documentary evidence that indicates that the Church has been aware of the existence of pedophilic priests for decades. More documentary evidence has surfaced showing that during the 1990s, when the scope and extent of the crimes of Catholic priests were just coming to light, the head of the Vatican's office of The Congregation for the Doctrine of the Faith (in the past this office was known as the Inquisition...and in the interest of brevity I will call it the Modern Inquisition from here on), none other than Cardinal Joseph Ratzinger, used the supposed spiritual and moral authority of the Church to silence victims and their families. Documents obtained from the time that Cardinal Ratzinger was in charge of the Modern Inquisition plainly show that his primary concern was with the reputation of the Church and its priests - not the well-being of the children entrusted to their care by parents and other adults who, believing in the religious authority and supposed benevolence of the Church, trusted that the children would be safe from harm.


So far the arguments seem to be mostly about the supposed "harm" done to the "skeptical movement" by those non-believers (like Richard Dawkins, Christopher Hitchens, PZ Myers, and in my own small way, yours truly) calling for the Pope to be taken into custody by civil authorities should he make trips outside the jurisdiction of the Vatican. Another tack taken by critics of those who feel an uncompromising stance is warranted is to argue that, in the long run, such a stance is harmful to the cause (and I use the term "cause" advisedly) of reason. As PZ puts it, the criticisms seem more about the tone of the more outspoken rationalists than the substance of their arguments.

I admire both PZ and Phil as bloggers and as science communicators, and they are different people with different ways of coming at this particular issue, but I have to go with PZ on this one. I am very much in favor of what Sam Harris has called "conversational intolerance" toward peoples' religious beliefs. The Bad Astronomer seems to make a distinction between holders of traditional religious beliefs1 and, say, anti-vaccination morons, holocaust deniers, and moon landing conspiracy nut-jobs. I do not make such a distinction, and it seems, rightly in my view, that PZ does not either. American citizens are free to enter into, or stay away from, whatever houses of worship most resonate with, or conflict with, their individual consciences (for a more academic treatment of this see here - my analysis of Constitutional issues surrounding religion and the U.S. Military for a Constitutional Law class I took while at university).

If gangs of (truly) militant atheists started going around boarding up houses of worship under cover of darkness or blocking access to them (the way believers attempt to block access to the clinics of doctors who provide abortions), I would proudly defend to my death the right of my fellow human beings to spout whatever supernatural nonsense they wish within the confines of their chosen house of worship. By the same token, I would proudly defend to my death my own, my children's, or anyone else's right to question, examine, analyze, criticize, and otherwise publicly tear to shreds any publicly stated idea or assertion, religious or not.2 Once religious beliefs leave the sanctuary (both literally and figuratively) of a "house of worship" - insofar as they are statements purporting to describe how the world/universe/reality actually works, assertions about the veracity of certain historical claims, or claims that high levels of popular religiosity are necessary to building a prosperous, safe, and just society here on earth - they are as open to criticism as any idea in the arts, science, history, or economics.

At the end of Phil's piece, he says we (meaning, I suppose, the "skeptical community") "need to be human and humane" if we wish to "change the hearts and minds of people". As social animals, one of our most precious possessions is our reputation as reliably rational (and the definition of "rational" can vary greatly) beings in our relations with others. What is wrong with the idea of putting people on notice that professing absolute certainty in the truth of nonsensical religious beliefs will endanger their credibility? As Sam (now Dr., as he finally finished his Ph.D.) Harris and Richard Dawkins have said repeatedly, we seem to have no problem openly laughing at, and mocking, the beliefs of those that accept the reality of the Norse and Greek pantheons, astrology, "crystal healing," and dowsing. Why is it that there is one remaining superstition that is "off limits?" The beliefs mentioned above have been successfully marginalized due, in no small part, to the application of heaping amounts of shame and embarrassment upon those that publicly profess certainty of their truth.

We inhabit a complex and frequently dangerous world in which we must often rely on others to (hopefully) reliably inform us of the dangers around us and how to avoid them. In practical terms, we cannot all be running around testing every wild mushroom or every colorful berry to see whether it is safe to eat or not; we have to rely on the fact that our neighbors’ information about such things is accurate and well-founded.

An example of what I am talking about is a conversation I overheard between two co-workers not too long ago. Both parties are very conservative, and the speaker in question is a devout Christian and generally a nice fellow, but the shrapnel from my exploding irony meter would have punched through an armor-plated Humvee at 200 meters. Here is basically what he said: "Yeah, I'll believe 'global warming' is real when I feel the rising sea lapping at my ankles." I remember thinking to myself, "Oh sure, you accept all of the miracle stories in the Bible as completely credible accounts of actual historical events, but the idea that nearly 7 billion human beings (and all the ancillary stuff of human culture...crops, livestock, etc. - a major part of the Earth's biomass, see this) could have a deleterious effect on the planet's climate...that is simply preposterous..." His credibility as a thinking human being is completely shot as far as I am concerned.

Imagine a student who, for whatever trivial reason, failed to do their math homework the previous night and was less than forthcoming about that failure the following day (okay, they outright lied and claimed to have done it). If that student is subsequently called upon to work one of the problems on the board, in front of the whole class, I am quite willing to wager that most people would agree that the shame and embarrassment the student felt was a deserved consequence of their prevarication. This is why people do not like to get caught with their intellectual pants around their ankles. There is not a single adult human being on the planet who is entitled to not be called "on the carpet" (as it was termed during my service in the U.S. Navy) when their faith is writing checks that the evidence and their intellect cannot cash.

Going back to Sam Harris' idea of "conversational intolerance," no one is under any obligation, whatsoever, to treat religious beliefs any differently than any other kind of woo-woo. Scorn and embarrassment are absolutely legitimate weapons in the fight against those that would endanger the future of humanity by clinging to the beliefs of humanity's infancy. Nor could I care less about the hurt feelings of Catholics in all this. I was raised a believer myself (though not a Catholic) and my shame and embarrassment at the many hypocritical moral and ethical failings of my fellow believers, both collectively and as individuals, is one of the reasons I turned my back on the faith I was brought up with. As a believer, I was raised to value honesty, forthrightness, and integrity, and found, much to my family's consternation, that I held everyone to those standards.

I think the "militant" non-believers are right on this issue. For me and, from what I can tell based on what they have said and written on this subject, many other non-believers, this is a matter of principle. For far too long, "believers" have coasted through their existence believing, due in no small part to the complicity of all of us, that they are nice, decent, moral people, all the while largely crediting their religious faith for their supposed "superior" morality, and using that supposed "superior" source of morality as a bludgeon against those who do not share their views. It is high time for all believers to be told, in no uncertain terms, that the core of their supposed "superior" morality is, in fact, morally bankrupt.

1I must say, in the BA's defense, that he has no pity for parents who withhold proper medical care from an ill child because of their religious beliefs. It must be admitted, though, that most prominent conservative Christian sects still take their kids to the doctor when they are sick; it is only the most truly nut-ball cults that deny medical care to children for religious reasons.

2 The idea that our democratic ideals are somehow connected to “Judeo-Christian” traditions is laughable. If democratic ideals were inherent within Christianity or its predecessor, Judaism, why did it take almost two millennia for democracy to rise from the ashes of Periclean Athens and the Roman Republic? In fact, Christianity cursed the West with the idea of the “divine right” of kings to rule, an idea Europe had to divest itself of before realizing democratic ideals. There is as little foundation for the idea of a secular, democratic republic (which is what the United States is) in the Christian scriptures as there is for Newtonian physics.