Thursday, July 27, 2017

At Last, a Female Doctor Who

I’m not a frequent user of social media for the simple reason that I have adult ADHD—nor do I suffer fools gladly—and with all the idiots on social media, these ingredients come together to create a supermassive black hole ready to devour all my time. So I take a pass on social media. I did, however, look at web sites reporting on the social media reaction to the announcement that the new showrunner for Doctor Who (Who), previously the showrunner/creator of the crime drama Broadchurch, had cast Broadchurch actor Jodie Whittaker as the 13th Doctor—the first female to play the role on an ongoing basis. I am jazzed at the prospect, but it saddens me to read the vitriolic reactions of what I can only hope is a small minority of self-identified Who fans. Unfortunately, the very negative online reactions of these supposed “fans”i to this news have much in common with the backlash from a sub-population of supposed Star Trek “fans” surrounding the casting and characterization choices (among other things) of those behind the new Star Trek: Discovery series, set to premiere this fall. I am addressing the disturbing political and cultural zeitgeist in the US (mostly) at length in another series on this blog, but as these retrograde trends are spreading to those who call themselves fans of the two most cherished science fiction franchises on television, Star Trek and Doctor Who, I have some things to say. Given the long-standing ethos of Doctor Who and Star Trek, one might be excused for being shocked by how much overlap there is between Whovians and Trekkers and Trump/BREXIT supporters, but the fact that such people seem to exist is an indication of how far we have yet to go.
My first exposure to science fiction was through Star Trek in the form of Star Trek: The Animated Series (TAS).ii I had always been interested in space, and even in grade school I would lie outside on clear, warm summer nights with Dad’s binoculars, a flashlight, and books about the stars and planets. In junior high, I made the happy discovery in the library of the paperback short-story adaptations of Star Trek: The Original Series (Star Trek: TOS) and TAS, by James Blish and Alan Dean Foster, respectively. My reaction was essentially “Wow! Space with stories!” Later, I found that the robust moral compasses of the protagonists and the high-minded, optimistic vision it offered of humanity’s future resonated strongly with the ideals of the religious faith I was raised in, without the disturbing end-times eschatology.iii
As I got older, I became aware of the iconic status of The Doctor, but I only got into Doctor Who after the 2005 “reboot” and was quickly sucked in—and for the record, I have never owned a Trek prop or uniform, but I do have a set of Who pub glasses and the sonic screwdrivers of Doctors 10 and 11, and I plan on getting the 13th Doctor’s sonic as soon as they are available. Like the Trek franchise, Doctor Who has never shied away from challenging viewers to look at things from a different perspective and aspire to nobler ideals. Another way the character of The Doctor resonated with me is that they have never suffered fools gladlyiv—probably because there is no room across the whole of space and time The Doctor can enter where they are not the smartest person therev—even when others in the room dismiss The Doctor as a daft old coot or a younger, oddly-dressed eccentric.vi
A female Doctor is entirely plausible from an “in-universe” perspective, as other Time Lords have changed gender as a result of regeneration.vii All Doctors have had to work through the choices made by their previous selves—the best recent example is in the Doctor Who 50th Anniversary Special “The Day of The Doctor,” where the 10th was the “man who regrets” and the 11th was “the man who forgets” (or tried to) how they thought they ended the Time War. Throwing in a change in gender along with the other changes regeneration routinely brings makes for meaty storytelling and acting challenges with immense potential.viii
As for the historically ignorant, loudmouthed puddingbrains complaining of some liberal, SJW “diversity” agenda being crammed down their throats by a shadowy cabal of liberal activists, the unfortunate reality is that the progressive idealism embodied by franchises like Doctor Who and Star Trek left space dock on November 23, 1963 and again on September 8, 1966, respectively. On November 22, 1968, the Star Trek episode “Plato’s Stepchildren” aired, showing the first interracial (black/white) kiss on prime-time American television—a mere seven months after Dr. Martin Luther King Jr. was assassinated—and the controversial nature of the scene and its timing was not lost on the writers, director, or cast. Also in the original Trek was the Vulcan philosophy of “Infinite Diversity in Infinite Combinations,” which was presented as a noble ideal worth aspiring to. In the run-up to the premiere of Star Trek: Voyager in January 1995, a small, but loud and obnoxious, minority of Trek fans were utterly apoplectic at the idea of a female starship captain headlining a Trek series for the first time. Were these past plot/storytelling choices instances of some SJW agenda being pushed down viewers’ throats 20 to 40 years ago as well? From the apparent mindset of many of those objecting to the 13th Doctor’s gender or the casting/plot of Discovery, it seems likely they would have thought so—at least if they’d been old enough to remember. It also raises the question of why the hell anyone with attitudes like theirs even watched Doctor Who or Star Trek in the first place, let alone those who claim to have been “fans” for decades. Do they suffer from some sort of cognitive impairment, or does a subconscious cognitive dissonance usually filter out the bits that conflict with their contemptible world-view?
Thoughtful, intelligent science fiction in general has an established history of challenging the conventions, prejudices, and preconceptions of its audience/readers, of which its authors, creators, producers, actors, etc., can justly be proud. Science fiction at its best tells stories that expand our minds, enlarge our horizons, and challenge us to reassess our perceptions, both of ourselves and others. This is how both Doctor Who and Star Trek have lasted over 50 years: by speaking to the consciences of their viewers/fans and, with varying degrees of subtlety, raising our consciousness, individually and collectively, to the possibility of our being better tomorrow than we were today.
I also have a further point to make for those on both sides of the new, female Doctor Who debate. Given that half of all human beings are born female, it is frankly preposterous, after nearly 55 years of the franchise’s existence, for the title character never to have taken on the form of a human female following a regeneration. However, I have encountered a small subset of the overwhelmingly positive online reactions supportive of the new Doctor which went on to express an apparently sincere wish to see future regenerations of The Doctor be representative of their specific, minority community. And just to be clear, I am not referring to the sarcastic comments along the lines of: “Next they’re going to have the Doctor be a transgendered, left-handed amputee with dyslexia.” After the past 50+ years of Doctor Who, a Doctor resembling the 50% of humanity that are our mothers, daughters, sisters, and nieces, etc., is not PC pandering. However, the world has given us Trump and Brexit, so I am less skeptical than I was of the ludicrously, ridiculously improbable actually coming to pass, and there may well be SJWs advocating a sort of “scorched earth” agenda which more sane, liberal, broad-minded human beings might mistake for headlines from The Onion.ix
One of the primary marks of our shared humanity is our passion for stories. The reasons storytellers tell the stories they do are as unique and individual as the storytellers themselves. Passively absorbing stories is easy, but engaging our higher cognitive faculties to think about the stories we hear, and maybe even learn something from them, is much more difficult, because doing that requires some minimal amount of courage: a willingness to challenge our own thinking. More difficult still is creating and telling such stories, and doing it well. Aided and abetted by the Internet, it is fast becoming a cliché for small subsets of fans to protest loudly whenever their favorite franchise goes in a direction they disapprove of, and this seems to be especially true of science fiction and other fantastical genres. Some feel there is way too little “diversity,” while others think the creators have sold out to the SJW agenda—though I doubt “fans” at either pole could coherently explain what such an agenda might be, even if their lives depended on it. However much the fans of any franchise might feel personally invested in it, they don’t own it. Since the first glimmerings of spoken language arose in our Pleistocene ancestors—perhaps even before—stories have helped us understand and explain the world around us, including our own existence. We share those explanations and understandings—the good, the bad, and/or how it might be better—with others so we might understand them and they us. Whether it was the first Sumerian scribe to record their elaborations on the already ancient oral Epic of Gilgamesh on cuneiform tablets, a lone 21st-century self-published author like Andy Weir, or the hundreds of people making creative contributions to a TV or movie franchise, they are entitled to tell the stories they want to tell, the way they want to tell them.
If anyone in the audience is rubbed the wrong way by the stories of others, thinking all’s right with the world as is, then they can seek out other stories. Or better yet, create the sort of stories they want to hear.
Among the many reasons J.R.R. Tolkien (and yes, I’ve read nearly all of his stuff too) gave for writing The Lord of the Rings was the challenge of telling a long, complex tale that would engage readers and perhaps even move or inspire them. Online, Whovians of both sexes have commented about how, from an “in-universe” standpoint, Whittaker’s Doctor will have to break through entrenched gender stereotypes to be taken seriously before she can get on with her real work of saving puddingbrains throughout the cosmos. I noted earlier how, from the very beginning, part of The Doctor’s shtick was being initially dismissed—sometimes as a harmless annoyance, other times as an incompetent, dangerous, unstable element in the crisis du jour—especially by those who ought to take The Doctor seriously. The best examples of this are, of course, Doctors 4 and 6 (Tom Baker and Colin Baker, respectively)—I mean, just look at those outfits! Since 1963, The Doctor has been an archetype for a protagonist other story characters will initially dismiss as a nutcase.
The challenge Chibnall has set for himself and the other writers for the 13th Doctor is how to maintain continuity with the essential characteristics of all the previous Doctors—not suffering fools gladly and knowing that, whatever room she (going forward) enters, she’ll automatically be the smartest occupant—with the new twist that the character’s quirky or eccentric sartorial tastes might not be the only in-story reason The Doctor is not taken seriously at first. For Ms. Whittaker, the challenge will be to make her Doctor believable in terms of the character’s long back-story, yet subtly informed by the long-overdue change in perspective that being perceived, superficially at least, as a human female affords. Having seen the first series of Broadchurch, I’m sure she’ll be brilliant.
When still a religious believer, I was saddened by the idea that no one would ever know if someday, a united, peaceful (at least amongst ourselves), curious, courageous, compassionate, and adventurous humanity would have spread outward to the stars—because of the whole “rapture” thing. Now, years later, as an atheist and a rationalist, I am still enough of a romantic idealist to find it bittersweet indeed that I personally will never know how the story of humankind will turn out, but my younger self’s hope for humanity’s future remains. Yet even among self-described fans of the two most hopeful, forward-thinking science fiction franchises ever, it looks as though there may indeed be substantial numbers of our fellow human beings we will never be able to convince that such a future is worth striving for. What we will do then I do not know.

i My use of scare quotes around “fan/s” is intended to suggest such fans might not have a clue about the broad ethos of the shows of which they claim to be “fans.”
ii I have only the vaguest recollection of seeing the Apollo 11 landing on television on July 20, 1969, about a month before my 5th birthday. The original run of Star Trek had ended the month before the landing, in June 1969.
iii Ironically, as an adult, it was my own moral revulsion at fellow Christians—who cared more about holding the “correct” beliefs than about believing in, and living by, the ideals they claimed to value—that started me down the road away from faith.
iv Unless it was part of The Doctor’s overall tactics/strategy.
v Bear with me, I’m trying to avoid using gendered pronouns and still be somewhat readable.
vi That aspect of the Doctor’s character—at least as they had been written and played by white guys—appealed to me because, after being diagnosed as “hyperactive” at the age of five (in 1970), in 2007 I was diagnosed as having adult ADHD—and with an IQ well above the 99th percentile. From the age of 5 to the age of 42, I always had the feeling everyone else knew or understood something that eluded me, only to learn that the reason I always felt somewhat out-of-place was that, every time I entered a room, I was quite likely one of the brightest people in it.
vii Or maybe “regenderation”? My typo might have been a Freudian slip. LoL
viii In the end, the 12th Doctor never knew he had gotten through to Missy, though he felt he had come close. Perhaps the 13th Doctor’s gender was due to a subconscious desire to understand why he thought he was ultimately not able to get through to Missy. If so, I call dibs on the credit for any story ideas arising from this.
ix Perhaps there is the equivalent of “Poe’s Law” for the more extreme forms of feminism—along with many other “isms.”

Wednesday, June 28, 2017

The Economics of the Moral Compass

Why do some people consider it a profound moral wrong for government or individuals to legally recognize a gay or lesbian marriage while others feel, just as profoundly, that to deny the legal right and benefits of traditional marriage to a deeply-committed gay or lesbian couple is a violation of basic human rights, simple human decency, and a moral wrong? How is it that we can come to such diametrically opposed answers to so many of the same moral questions? When we describe something as being “morally wrong,” exactly who (or what) is being wronged? Is a path through the minefield of the deeply-held beliefs of our fellow human beings even possible? My intent is not to tell anyone what to think, but I do intend to present some ways of thinking about the problems and specific elements that must be part of whatever answers we may come to.
Two modern sciences are especially relevant in trying to answer the questions posed above, both of which have roots that arguably go back to the Ancient Greeks and, during the Enlightenment, were called moral philosophy and political economy. In hindsight, it is not a coincidence that the Scottish philosopher Adam Smith wrote the seminal works that essentially founded the disciplines today called moral psychology and economics: The Theory of Moral Sentiments in 1759 and An Inquiry into the Nature and Causes of the Wealth of Nations in 1776. Just how tightly intertwined moral issues are with economic issues in modern societies is easily seen in the annual battle over the federal budget in the U.S., where economic and budgetary priorities are often dictated by the moral/ethical priorities of differing constituencies or pressure groups. The issue of sex education in public schools is a good example: do we spend the money on “abstinence only” programs, or do we spend it on programs that aim to prevent unwanted teen pregnancies and the spread of STDs? Where we, both individually and collectively, come down on that one is determined largely by our moral reasoning around human sexuality. I will get to such issues in time, but to make my point, I must first discuss a sub-field of both economics (a social science) and psychology (a behavioral science) that has only been around since the 1960s, called behavioral economics.
Economists going back to Adam Smith assumed that economic actors—business owners, consumers, investors, etc.—were rational actors pursuing their individual aims in the objectively most efficient way possible. Subsequent generations of economists made this assumption explicit by defining “man” in a strictly economic context as an agent that seeks only to maximize his store of wealth by the most effective and rational means possible. As national economies grew and became ever more inter-connected throughout the 19th century, the amount of economic data available to economists grew by leaps and bounds. In parallel with this, advanced analytic mathematical techniques were developed that allowed the creation of mathematical models of economic systems in which all actors—buyers, sellers, producers, suppliers, etc.—have the same information as the other parties and, in that context, act rationally in pursuit of their individual ends. The quality or robustness of any mathematical model of a system, whether of the gravitationally-bound Earth–Moon system, the climate, or an economy, is judged by comparing the performance of the model against what has happened in the past, what will happen in the future, what happens in controlled experiments, or a combination of all three. As economists in the first half of the 20th century applied ever more sophisticated models to the wealth of economic data available to them, the abstraction of human beings as rational actors grew increasingly untenable, as their models often diverged from what “economic actors” actually do in the real world.
Of the nations that fought on either side in World War II, not only was the United States the only one to come out of it with its economy intact, but that economy flourished beyond anyone’s most optimistic imaginings in the years and decades afterwards. In this economic boom, the number of roughly equivalent types of consumer goods, e.g. vacuums, toasters, televisions, radios, automobiles, etc., that people could choose from grew faster than consumers could keep up with. It became essential that manufacturers and retailers—and the advertising and marketing firms they retained—understood how consumers think and how best to pitch their products so consumers would choose theirs over the available alternatives. Over the last 50+ years, advertising and marketing firms have taken advantage of insights into human decision-making and cognition from the behavioral sciences.
One reason the rational actor model fails is that most of our decisions, even big ones like one’s college major or whom to marry1, are the result of heuristic processes—common sense, rules of thumb, and educated guesses—in which our emotions, rather than any sort of rational analysis, do most of the work. Heuristic processes are usually not rationally coherent, but they do have the advantage of a much lower cognitive load—and are often the most emotionally palatable in the moment. Social context matters as well, as shown in a study published in the Journal of Consumer Research in 20002, in which the authors describe an experiment that sought to find out whether customers’ choices of “free” beer samples in a college-town brewpub were influenced by others in their party. In the first variation of the experiment (I’ll call it “A”), customers verbally indicated their choices within the hearing of others at the table. Then, after having a chance to taste their selection, they were presented with a card asking them what their selection was and what they thought of it. In the next variation (“B”), customers indicated their order on a card—essentially a secret ballot—then, just as in the first variation, were asked to rate their enjoyment of their choice.
When the results of both variations of the experiment were tallied, researchers found that when everyone else at the table could hear their order, few, if any, customers chose the same beer as any of the others. However, in the “secret ballot” variation, there was a great deal more clustering of customers’ choices. When the scientists looked at customers’ post-tasting feedback for round “A,” they found that those whose orders were among the last to be taken were less enthusiastic about their “choice.” In round “B,” when each customer was unaware of what the others ordered, they had a much higher opinion of their choice. The researchers noted that these findings were exactly what one would expect when behaviors and choices are dominated by a need for uniquenessi—in this context called “Consumers' Need for Uniqueness (CNFU)”—and yes, in the world of consumer research it really is a “thing.”ii Our culture places a high value on individual uniqueness and independence of mind, and in social situations—especially when we are hoping to impress others with our independence of mind—this need can often outweigh our own preferences.
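For readers who think in code, the dynamic the researchers describe can be sketched as a toy simulation. To be clear: the preference numbers, party sizes, and the “switch to an untaken beer” rule below are my own hypothetical illustrations of the need-for-uniqueness effect, not the study’s actual data or methodology.

```python
import random

random.seed(42)

BEERS = ["pale ale", "stout", "lager", "wheat"]

def order_round(party_prefs, public):
    """Simulate one table ordering beer samples in sequence.

    public=True  -> orders are spoken aloud; a customer whose favorite
                    beer was already named switches to their best
                    *untaken* beer (a crude "need for uniqueness" rule).
    public=False -> secret ballot; everyone simply orders their favorite.

    Returns each customer's satisfaction: their preference score for
    the beer they actually drank.
    """
    taken, satisfaction = set(), []
    for prefs in party_prefs:
        favorite = max(range(len(BEERS)), key=lambda b: prefs[b])
        if public and favorite in taken:
            untaken = [b for b in range(len(BEERS)) if b not in taken]
            choice = max(untaken, key=lambda b: prefs[b]) if untaken else favorite
        else:
            choice = favorite
        taken.add(choice)
        satisfaction.append(prefs[choice])
    return satisfaction

def sample_party(size=4):
    """A hypothetical party where most people happen to like the first beer best."""
    return [[random.gauss(mu, 1.0) for mu in (2.0, 0.5, 0.5, 0.5)]
            for _ in range(size)]

public_scores, secret_scores = [], []
for _ in range(1000):
    party = sample_party()
    public_scores += order_round(party, public=True)
    secret_scores += order_round(party, public=False)

print("mean satisfaction, orders spoken aloud: %.2f"
      % (sum(public_scores) / len(public_scores)))
print("mean satisfaction, secret ballot:       %.2f"
      % (sum(secret_scores) / len(secret_scores)))
```

As in the study, the “spoken aloud” condition spreads the table across more beers, leaving later orderers less happy with their “choice,” while the secret ballot lets choices cluster on the beer people genuinely prefer.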
Some critics have accused much of the recent research in the cognitive and behavioral sciences—and by extension, the researchers—of aiding and abetting the cynical—I’ll be blunt—manipulation of consumer-citizensiii (i.e. us), especially in the application of such research to the field of marketing.iv I personally do not subscribe to such conspiracy theories; it is true that researchers in social psychology and behavioral economics are often attached to universities’ schools of Management and/or Business, but I think Hanlon’s razor3 more than adequately accounts for this. It is also worth mentioning that a substantial share of current research—and thus researchers—is focused on helping professionals of all kinds, as well as consumer-citizens, make better decisions by pointing out the blind spots, biases, and fallacies we are all prone to. Even though we are not the rational economic actors that economists since Adam Smith believed we are, all is not lost. Irrational we may be, but our irrationality is not random; it falls into discernible patterns that we as individuals can be aware of, and, empowered by the revelations of behavioral economics—and with practice and self-discipline—we can learn to make better economic and financial decisions and choices.
Next time, as promised, we will segue to the field of moral psychology and see how our moral reasoning can go awry.

Works Cited

1. Ariely, D. ‘Ch. 1: The Truth about Relativity,’ Predictably Irrational, p. 10 (Harper: New York, NY, 2008).
2. Ariely, D. & Levav, J. ‘Sequential Choice in Group Settings: Taking the Road Less Traveled and Less Enjoyed,’ J. Consum. Res. 27, pp. 279–290 (Dec. 2000). Accessed 23 Jun. 2017.
3. ‘Hanlon’s Razor,’ Wikipedia (27 Apr. 2017). Accessed 27 Jun. 2017.


i Individuals’ “need for uniqueness” is a well-validated aspect of human psychology. The scenario of a woman being distressed at the thought of showing up to a party in the same dress as someone else may be more than a mere comedic cliché.


iii Apparently, the hyphenated word “consumer-citizens” raises the hackles of some folks, but the reason I decided to go with it is that so often the same biases, fallacies, and blind spots that affect our behavior as consumers are the ones that influence our politics. A question that has arisen in the last 35 or so years that illustrates this commonality of the two spheres of modern life is: “Why do so many working-class, less well-off folks vote Republican, against their own economic self-interest?”

iv See:, p.276-7

Initially started at, then look for Robert McChesney.

Tuesday, February 7, 2017

My Moral Compass-Pt 2-Calibration


Before I start throwing around words like “moral” and “conscience” even more than I already have, I need to unpack what I mean—and just as importantly, do not mean—when I use them.

I often use “moral compass” and “conscience” interchangeably, but whichever term one favors, it is an entirely natural, materialistic, neurological and cognitive product of our evolutionary history as social animals. Like nearly every other natural trait, it is highly variable, with individuals falling somewhere along a spectrum of variation. That variability also makes it possible for the conscience/moral sense to be shaped by “nurture,” i.e. our culture and social environments—sometimes in positive ways, and at other times, not so much.i

It is quite likely that even before religion became “organized” some 12,000 years ago, it figured out how to hijack adherents’ moral sense, using that sense’s natural malleability to manipulate and control them. The monotheistic faiths have since refined this manipulation into an art form. We see this today when the faithful go around condemning, for instance, what two consenting adults do in the privacy of their own bedrooms as immoral, often for no other reason than the fears of the faithful that they may become collateral damage when God rains down fire and brimstone on the Godless. A good litmus test for what are and are not legitimate moral issues is whether they can be reduced to something like:

“If we allow people to do X, or permit them to avoid doing Y, then God will be displeased and then ‘bad things’ might happen and people might get hurt.”ii

Allowing fear, anger, anxiety, and distrust to dictate our actions is an abject capitulation to the darker side of human nature and an active repudiation of the highest moral and ethical aspirations of the human conscience, or in the words with which President Lincoln closed his First Inaugural Address: “...the better angels of our nature.”


However much it may have felt like it when the plurality of Americans who cast their presidential ballots for former Secretary of State Hillary Clinton (me among them) woke up on November 9th, we were not the victims of a terrible transporter accident—à la Star Trek—suddenly stranded in a dystopian mirror universe where the Axis powers won WWII or something equally bonkers. When the electoral college met on December 19th, 2016, Clinton’s popular vote exceeded Donald Trump’s by 2.86 million—nonetheless, on January 20th, 2017, Donald J. Trump and his running mate, Michael (Mike) R. Pence, were sworn in as the 45th President and the 48th Vice President, respectively. Yes, it really happened, and it will be part of the new reality that we, and the rest of the planet, will need to deal with for perhaps generations to come. The cultural, political, religious, demographic, and economic forces that converged to enable Trump’s occupying the Oval Office go back at least decades, and some can be traced all the way to the Colonial era.

I had originally planned to argue that the vacuous depths to which our civil and political discourse would sink in the 2016 elections were presaged in no small degree by the Rush Limbaugh–Sandra Fluke controversy, but something happened in my own life over the Christmas holiday that I think is a far more powerful vignette for making my point than a re-hash of news items from over four years ago.

Since my unexpected return to Rapid City, South Dakota over 10 years ago (damn, has it been that long…?), I have accompanied my parents to Christmas Eve services at one of the two churches they attend. Fortunately, I have yet to burst into flame or have my head rotate through 360º as I entered. I love Christmas and I am not above being moved by the standard Christmas narrative, and like Linus Van Pelt, I can still quote from memory the Gospel of Luke, Chapter 2, verses 8-16. Then again, I am also moved to tears every time I read—and, thanks to Sir Peter Jackson, watch—Sam carry Frodo those last agonizing steps to the summit of Mount Doom, demonstrating that the mere fact that one finds a narrative profoundly moving has no bearing at all on the historicity of the events described in the narrative. One of the most enjoyable Christmas Eve services I have been to was one structured around the historical back-stories of some of the best-loved Christmas carols—which quite naturally appealed to my inner history buff.

So it happened that on the evening of December 24th, 2016, I was at Christ Church in Rapid City, South Dakota—the same church where, several years before, I had so enjoyed learning things I did not know about some of my favorite carols. My mother has a very high opinion of the pastor, Richard Wells, whom she describes as being a “very learned man.” This year’s Christmas Eve message was titled “This Baby Changed the World.” The pastor opened by noting the variety of calendar systems around the world, many of which we now associate with one religious tradition or another. All of which is true—but I am saddened to have to report that it was downhill from there. He related how the calendar used in republican Rome dated years from “the founding of the city”—which is certainly true, but when he began mocking pre-Imperial Rome for this, I had to wonder if this “learned man” had ever actually read I or II Kings (or the other “historical” books of the Old Testament), because they are chock-a-block with references to such-and-such an event having occurred “in the Nth year of the reign of king X.”iii

Next, adopting a conspiratorial tone, he teased his audience by promising to reveal the true motivations behind the reforms of the Roman calendar begun by Julius Caesar and continued by his successors to sync the civil calendar of what was by then the Roman Empire to the seasons—which is why it later came to be called the Julian Calendar. According to Pastor Wells, the eminent practicality of having a calendar that accurately tracked the seasons, providing a much needed uniformity when it came to keeping the far-flung Roman Empire fed, was merely a smokescreen. As far as this “learned man” was concerned, the “real” reason for the Julian calendar reforms was that they were an elaborate plot by Satan himself to distract a fallen humanity from the advent of the incarnation of God on Earth™, Jesus of Nazareth®. Perhaps feeling he was on a roll, he then started mocking the current Dalai Lama, Tibetan Buddhism, and generally any religion that was not his sort of Christianity. When recounting his visit to the Dalai Lama’s website, the “learned man” spoke in contemptuous, sarcastic tones of the emphasis in Tibetan Buddhism on virtues such as “compassion,” “understanding,” “forbearance,” “peace,” and “humility.” Dripping with sneering contempt, his voice rising to a crescendo, he spread his arms, exaggeratedly beseeching his flock to answer the question posed by what he took to be the current Dalai Lama’s essential message: “Can’t we all just get along?” Without giving the congregants time to digest his conspiratorial ravings, he answered his own rhetorical question with a resounding “No!”

Back when I still considered myself a believer, I would have been surprised at this because (as I believed at the time) we are God’s children and so—saved or not—virtues like “compassion,” “understanding,” etc. were written in the hearts of all of humanity by God and that heeding the call of those virtues was a first step on the road to salvation, and as such should be encouraged. Instead, this “learned” man cranked the wheel and exited the moral high road, warning his audience that all this talk of “compassion,” “empathy,” and the relief of human suffering, etc. was in truth an elaborate trap set by Satan himself to lead Christians astray.iv A cold foreboding came over me when I realized that no nuanced exceptions were forthcoming. For the remainder of the service, the only thing preventing me from standing up and walking out was my love and affection for my mother and the desire not to see her shamed and embarrassed publicly.

Even when I was a serious, sincere Christian, I would have found anyone’s advocacy of such sentiments morally contemptible. In my teens, there came a point when I became aware of facts some might use to cast doubt on the truth of the Gospel—for example, that the law code of Hammurabi long predates the Ten Commandments, or that the Golden Rule was known to ancient civilizations and cultures long before the first versions of it appeared in the Old Testament, let alone the New. The apologia I constructed back then to account for the universality of the human conscience, especially in cultures that predate Judaism and Christianity, applies with no less moral force to what I heard on Christmas Eve 2016 than it did when this line of thought first occurred to me some 35 years ago. It runs something like this:

The very fact that all human beings have an innate, universal response to, and a longing for, things like mercy, justice, and fairness—and are repelled by cruelty and by the mistreatment and abuse of those unable to defend themselvesv—is precisely because we are all God’s creations, and while we are born as sinners, each one of us has a conscience, bestowed on us by God, that is no less a part of our humanity than is our sinful nature. Without an innate conscience to initially guide one towards the light of salvation, how could anyone ever recognize it when they found it? As Christians, we are called to be lights for Christ in this world, but if, as Christians, our words and conduct are abhorrent to the consciences of those we seek to “save,” then it is really we who are in need of being “saved.”

In writing the above, I adopted the voice of the Christian I once was, yet even as the atheist I am today, I would make essentially the same argument: irrespective of claims made regarding the relative merits of religious faiths or of ethical, philosophical, and political systems, to the degree that they are contrary to the highest, most noble aspirations of the untainted, innate human conscience, they are an abomination.

There is a common thread running through all of human history, from the Atlantic slave trade of the 15th through 19th centuries, to the slaughter of indigenous populations by European colonists, to the concentration camps and gas chambers of Nazi Germany, to the “ethnic cleansing” that plagued Eastern Europe following the collapse of the Soviet Bloc in the early 1990s. One of the first steps leading to these horrors—and many others not mentioned—is when one group becomes convinced that members of another group are somehow not human enough to be legitimate recipients of the proddings of one’s conscience.

In further essays in this series, we will examine some possible specifics of how this came about and what we might be able to do about it.

iThe ability of most primates (including us) to pick up an object with our opposable-thumb-equipped hand and throw it is an evolved biological trait, but doing it well—as in consistently hitting one’s target—is a skill that has to be learned. So while just about anyone will improve with practice, that does not mean we can all perform such a task well enough to make the majors. Likewise, it has long been accepted that the human ability to acquire and use (spoken or signed) language is a biological trait that has evolved. Additionally, evidence for the evolution of a number sense in humans—and other species—is rapidly growing.

Though our scientific understanding of how the human moral sense evolved is yet in its infancy, it is clearly tied to the sort of altruistic, pro-social behaviors seen in other social mammals, from bonobos (pygmy chimpanzees) and African bush elephants to dolphins.

As the spoken languages of our ancestors became sufficiently sophisticated, our interactions with each other could become increasingly complex and nuanced. Just as rules of grammar and identifiable parts of speech—nouns, verbs, subjects, objects, etc.—appeared and were eventually codified for each language, so too would rules appear for maintaining the person-to-person and group-to-group relationships upon which our survival depended.

iiI almost laughed out loud when I re-read this because it reminded me of that Mafia movie cliché: “Ya know, it’s a nice place ya got here. It’d be a real shame if sumthin’ happened to it, eh, Knuckles?”

Actually, the idea that organized religion shares aspects of organized-crime protection rackets kind of makes sense, and as a quick Google search revealed, I’m not the first to see the similarity.

iiiSee, for example, 1 Kings 16:23: “In the thirty-first year of Asa king of Judah, Omri became king of Israel, and he reigned twelve years, six of them in Tirzah.” (NIV)

ivIn many ways, evangelical/fundamentalist Christianity is the most elaborate and widely-believed conspiracy theory ever cooked up by the human mind.

vYeah, what about those bits in the Bible about taking care of widows and orphans, eh?

Sunday, November 20, 2016

My Moral Compass-Pt 1-Origins

Introductory Note:

Though I am now an atheisti, I was not always so. Until my late teens, I was a very sincere, devout Evangelical Christian, and my younger self took that faith very seriously indeed. In describing the thinking and reasoning of that younger self, I have endeavored to treat it with all the seriousness I did at the time. My use of capitalizations and symbols such as “™” and the registered trademark symbol “®” comes, it is fair to say, from my 2016 self. My reason for doing this is to make the point that my younger self’s (okay, I kind of feel like I’m trying to write a Doctor Who episode) understanding of things like God’s will, and of what the most important parts of Christianity were, did not, in the end, match that of the Christians among whom I spent my formative years.

Forging My Moral Compass-Pt 1-Origins

Raised in an Evangelical Christian home, from an early age I was taught that it is wrong to bear false witness, and that what is, or is not, actually true, really matters. God, I was also taught, is a just God, and it was our duty to follow His example and strive for justice here on Earth. Being a thoughtful, reflective youth, I reasoned that since God was omnibenevolent, the source of all that is good in creation, it must follow that the highest, most virtuous ideals to which humanity can aspire were placed in the human heart and mind by God. While no one can ever fully achieve those aspirations, to honestly and humbly strive to do what we can, each of us according to our own lights, is surely the most ennobling and edifying journey a human being can undertake. Considering myself a Truly Sincere Christian Believer (TSCB) at the time, to do otherwise seemed to me to be contrary to God's Will®. I also thought, quite reasonably in my view, that the depth and sincerity of others’ Christian commitments would be manifested in their words and deeds too.
Alongside my religious upbringing, I was also a bright, inquisitive kid with a profound curiosity about the world around me, especially science and history. I would lay outside at night in a sleeping bag with Dad’s binoculars and books about the stars and just gaze in wonder for hours. I went through phases where I believed in ghosts, Bigfoot, and UFOs, but deep down inside I was always a skeptical, critical thinker—and no, I never did get a straight answer on just where Cain found a wife after being marked and banished for the murder of his brother, Abel. By the time I reached my teens, my insatiable curiosity led to my being an avid reader—mostly science, ancient civilizations, lost cities, and the like, plus fiction—with a speed, vocabulary, and comprehension typically three or four grade levels above my classmates at school.ii
In my early teens, and still a committed, believing Christian, I had begun to see things I found troubling. Growing up, I remember singing songs like:
Jesus loves the little children,
All the children of the world.
Red and yellow, black and white,
All are precious in His sight,
Jesus loves the little children of the world.
I took these words to heart and was, quite understandably, very disturbed when the son of our church’s pastor, someone I considered a role model, casually referred to Native Americans as “rezzers” and African Americans as “darkies.” I distinctly remember thinking, “What happened to ‘Red and yellow, black and white, All are precious in His sight’?” At the time, I was profoundly troubled by this, but I thought it must be me. Perhaps there was some incredible epiphany others had been graced with that I had not. Lacking any “herd instinct,” I was never the sort to get into fights under the bleachers at school football games, nor did I really get all the hostility between different faiths, but I did believe that the Christianity I knew had the “truth,” or near enough so, and that someday the apparent contradictions between what I observed and what I believed would be resolved on the basis that my religion was “true.”
In high school, my peers and I were constantly reminded of all the temptations “out there” in the godless, sinful world and how important it was to resist them. The music we listened to, the people we associated with, the words we used, the activities we engaged in, what we stood for, what we took stands against—it was impressed upon us that our choices can either redound to the credit of our Christian Witness©, or they can fatally undermine it. Obviously then, our words and actions have downstream consequences, not only for ourselves, but for others. Having learned to loathe hypocrisy, I considered it incumbent on anyone that called themselves a “Christian” to proactively consider possible unintended consequences of what they say and do, because like a math teacher, God would insist that we show our work.
We were also cautioned to avoid the company of the “wrong crowd” lest our “walk with Jesus” suffer and/or, by presenting an un-Christ-like example, we might recklessly endanger unsaved souls by acting in ways which discredited not only our own Christian Witness©, but that of Christians generally. As a genuine TSCB, I had no wish to be responsible for anyone’s soul being damned for all eternity. Being idealistic—“idealistic” in the sense that the wider, secular world would recognize—I wanted to be one of the “good guys”—and in my youthful naiveté, I thought doing both would not cause any conflicts. After all, if Christians were to be shining lights of “goodness” in a sinful world, it was obvious that the “goodness” had to be of a sort the “unsaved” would recognize, in order to make a positive impression on them.
Now, looking back 40 or so years, I realize that my Christianity influenced my idealism a great deal less than I assumed at the time. I now think it had more to do with the good-guys/bad-guys, heroes-versus-villains narratives depicted on TV, in moviesiii, and in books. Personal experience also contributed to my idealism. I was always small for my age and was bullied off and on in elementary and junior high school. Eventually I learned to defend myself and found that most bullies, being lazy and not very bright, seek out only the easiest targets once you stand up to them—at least that was so in the late 1970s and early ’80siv. Having been the victim of bullying myself, I felt empathy for others so victimized and stuck up for them.v The fact that I never saw any of my peers who thought of themselves as Good Christian Teens® come to the defense of someone being bullied only added to my growing disappointment.
When I left home for boot camp after graduating high schoolvi and became more aware of current events and the history behind them, the disconnects between what Christians as a whole said they believed and what they actually did grew glaringly obvious—and morally repugnant. One of the largest fault lines ruptured in my early 20s: in the mid-1980s there were several high-profile stories about anti-abortion activists who had conspired to violently attack abortion clinics and, in some cases, actually murder abortion providers. My personal moral compass led me to expect Christian leaders to thunderously denounce such acts from every pulpit in the land, because somewhere in my moral development I had picked up the odd little notion that there are some ends that can never justify the means undertaken to achieve them. Instead, the only voice I heard was my own—accompanied only by a chorus of crickets. That “moderate” Christians, as individuals, congregations, and denominations across the nation, kept their silence, refusing to condemn in no uncertain terms such violence in the name of God, Jesus, or whatever, was (and is) morally contemptible.vii By that point I had already arrived at what one might describe as an agnostic Deism. But from that point on, any remaining shreds of moral or ethical credibility Christianity might have had, leftovers from my childhood, were gone, forever.
Amazingly, more than a quarter century after realizing I was an atheist, and after having spoken and written about my journey so many times before, it was only in the course of writing this that I had something of an epiphany—a bit ironic, given that the original meaning of “epiphany” is an experience of the divine. For as long as I can remember, what I thought made Christianity the “true” religion was that at its core was an abiding belief in, and a commitment to live by, a set of values—the very values that had so powerfully resonated with the proddings of my own innate conscience, values and virtues that felt self-evidently right. The “Christianity” that in the end I rejected seemed to care more about valuing the right beliefs. This pattern of starting out believing in the right values, only for it to flip, whether by design or not, into valuing the right beliefs, will be a recurring theme throughout this series.
Below is but a sample—not an exhaustive catalog—of the values and virtues I feel obligated to uphold. Some of them had their genesis in my Christian upbringing, while others did not. Regardless of the context in which I became conscious of them, I have striven to adhere to them.
  1. Empathy
  2. Justice—Just as an example, I accept that there may be some crimes so heinous that the guilty party’s life ought to be forfeit. On the basis of Old Testament scripture, many Protestant denominations do in fact argue in favor of capital punishment. For the sake of argument, let us suppose they are correct in doing so and that executing those found guilty of certain crimes is God’s Will®. However, if God is just, then it would be utter folly indeed to assume that He would not demand that those who claim to be acting in His Name® take great care that they execute the right person! For a Christian in the 21st century to brush off the 340-plus people wrongly convicted—20 of them on death row—many exonerated by DNA evidence, is an unconscionable act of bearing false witness.1 If the God they believe in truly exists, they should all start praying now that He have mercy on their souls.
  3. Humility—including a tolerance of ambiguity, or, in the words of Oliver Cromwell, Lord Protector of the Commonwealth of England, Scotland, and Ireland during the English Civil Wars of 1642–1651: “I beseech you, in the bowels of Christ, think it possible you may be mistaken.”
  4. Belief, without understanding, is little more than prideful ignorance
  5. Ends do not always justify means, and if someone tries to convince you otherwise, run!
  6. Chattel slavery—one human being “owning” another, in the sense that we own our car or our pets—is, was, and always will be wrongviii
  7. Physical, intellectual, and moral courage are all of a piece
  8. Do not demand from others what you do not demand from yourself
  9. Question everything—even those things others say you should not—and you’ll fall for nothing
  10. Be skeptical of any person, institution, or ideology that attempts to appeal to our fears—of others, of the new, the different, the unknown, and the unfamiliar—for they have only their own best interests in mind, and no one else’s (more on this later)
My native human conscience, when applied to the Evangelical Christian environment I was raised in, is what finally compelled me to leave religion behind and, eventually, to reject supernatural beliefs altogether. In Part 2, I will examine how we, as a nation, became more concerned with valuing the right beliefs—often utterly uncoupled from empirical facts and/or evidence—than with believing, and living by, the right values, and how these failures fed into the existential crisis the United States found itself in during the 2016 presidential campaign that ended with the election of Donald J. Trump as the 45th President of the United States (POTUS).

1. “Innocence Project-Cases.” Innocence Project. < >. Accessed 20 Nov. 2016.

iAs Richard Dawkins has observed, it is nonsensical, and even harmful, to say that someone was “born an X,” where X is a religious faith—the only real exception is “Jew,” which would then look like: “She was born a Jew, but is now non-practicing.” I would, in a tongue-in-cheek way, be happy to agree with “Mark was born a pantheist...” because I was new, and everything looked and felt supernatural.
iiIronically, by the time I was in high school, I was attending the adult Sunday School classes at my church, and it was not infrequent for someone to suggest I consider going into the ministry.
iiiFortunately, my parents did not forbid my siblings and me from going to movies, as the parents of some of the kids I went to church with did.
ivNow parents have to worry that the bullies might be carrying a gun.
vThis is an area where my religious upbringing did make a lasting contribution to my personal moral compass by reinforcing my loathing of hypocrisy. As a young teen, I noticed that too many groups/communities that were persecuted or suffered discrimination in the past did not get the moral of their own stories, later failing to defend other groups suffering similar injustices. The Plymouth and Massachusetts Bay colonies were established by groups fleeing religious persecution in England but upon arriving in the New World, showed no compunctions about treating dissenters in their own midst exactly as they were treated before leaving England.
viI did not expect to spend 20 years in the military at the time, and I’m still somewhat astounded that I lasted that long.
viiThe phenomenon of more moderate, less extreme religious believers failing to rebuke the extremists in their ranks is found to a much greater extent in Islam, but that fact in no way, shape, or form lets Christianity off the hook. Put it this way: how incredibly impressive would it be if Christianity openly and publicly demonstrated that it takes seriously its duty to extirpate the cancer of extremism from within its own body of believers, even while “moderate” Muslims did nothing about their own much bigger problem?
viiiAs a student of ancient history, I have to acknowledge that “slavery,” broadly construed, has been ubiquitous throughout history, likely since before the dawn of agriculture and the first cities. Explaining something, however, is not the same as excusing it. In Rome (both the Republic and the Empire), slaves were often the only members of a household who could read and write, earn their own money, and even buy their own freedom. In fact, while a freed Roman slave was not considered a Roman citizen entitled to what we today would call “due process,” the children of a freed Roman slave were considered citizens. As any Christian ought to know—and shame on any who do not—being able to assert one’s rights as a citizen of Rome came in awfully handy to the Apostle Paul.

Under the brutal, inhumane chattel slavery practiced in the United States, ended only by the Civil War and the passage of the 13th Amendment to the Constitution, it was forbidden to teach slaves to read or write, and there was no way for a slave to earn money and buy their own freedom.