Tuesday, December 9, 2014

Merely “Leisure” Activities?

I am often surprised by the stuff I know that others seem totally ignorant of. It is not as though I deliberately set out to stuff my head with trivia to impress people around me, or, as I have more than once been accused of doing, to make them feel stupid. Even when I am not suspected of being deliberately uncharitable in my opinions of my fellows, others feel obliged to remind me that not everyone is “interested” in the same things I am, which usually revolve around doing something useful with a computer.
We all have the right to allocate such resources as we can afford–in terms of time, money, and energy–to the leisure activities of our own choosing or inclination–provided they are legal, of course. In developed countries, at least for those not mired in poverty (generational or otherwise) or burdened by a cognitive disability, there is a wide range of leisure pursuits available. To those who whine about being “bored” or complain that nothing that “interests” them is immediately to hand, it bears pointing out that it is those with boring, unimaginative minds who are often the most easily bored; in short, they are not trying hard enough.
A few weeks ago I was talking to a co-worker who had recently purchased a Windows 8 computer, and he complained about the absence of the familiar “Start” menu. I empathized, telling him that when I was shopping for a new laptop shortly after Win 8 debuted, it took me all of 10 minutes on Google to find a workaround so I could get back to a Win 7-type desktop, which I too preferred. He responded (I’m paraphrasing), “Well, you just 'get' stuff like that.” I tried to point out that it is the exact same skill set one once used to look up information on a particular topic with a book's index or an old-fashioned library card catalog, but his reaction gave me the distinct impression that wrapping his head around what I had said would have required more thought than he cared to, or could be bothered to, invest. Much of this tension arises from the idea, widespread in our modern culture, that having to figure something out, to think, or to use the ability to reason in order to reach a goal is somehow optional in the same way that knowing how to crochet is optional.
In our 21st-century society, some have gotten it into their heads that expecting supposedly competent adults, at least occasionally, to step outside their pathetic little comfort zones and exert themselves mentally, cognitively, or intellectually so they can get to wherever it is they want to go, or accomplish whatever it is they want to accomplish, in life is somehow a form of bullying. Such a position could not be more wrong, or more dangerous. As an adult with a clinical diagnosisi of ADD, staying organized, focused, and on-task in no way, shape, or form comes naturally to me. Given my diagnosis, according to experts who assist those with disabilities in finding and keeping a job, I should avoid desk-bound jobs that require good time-management, record-keeping, and organizational skills. The problem is, that is exactly the job I have. Every day I have to discipline myself to do things that do not come naturally to me or that were once outside my personal comfort zone, like keeping good case notes on my clients and staying on top of my schedule. At this point, a reader might be forgiven for thinking I am a really arrogant bastard; however, they may be surprised to learn that I hate to say “no” when someone asks for my help with something, even when it means dropping what I'm doing. It has been a struggle learning to say “no” when I am working on something that cannot easily be set aside and picked up again later.
Homo sapiens (modern humans), and the plants and animals we raise for food, have become the dominant terrestrial forms of life (excluding insect-sized critters and smaller) on the planet. The reason for this is (or ought to be) obvious: our ability to reshape the environment around us. In modern, developed societies, that environment is dominated not by the climate, or the local geography, but by the culture(s) in which we live. Our brains, and their associated wiring, largely determine what our natural inclinations are, and this powerfully influences which hobbies, interests, and leisure activities we choose from among those available in the culture(s) in which we are embedded. Among the consequences of this interplay of genes and culture are the friends we choose, what we choose to talk about, the music we listen to, the books we read (or don't read), the movies and television shows we watch, the foods we come to enjoy, and a great many other things.
This bi-directional interaction of our genes and the environments we tend to gravitate towards (due to our genetic predispositions), when played out over the human lifetime, can, and does, have profound effects on intellectual development. In the lead-up to the October 2012 annual meeting of the Society for Neuroscience, the 5 October issue of the journal Science was devoted to highlighting 'Mysteries of the Brain.' In that issue, a news article titled ‘Why Are You and Your Brain Unique?' looked at, among other things, what we do and do not know about intelligence. The scientific consensus is that in young children, roughly 20% of the variation in intelligence is due to heredity (this is where twin studies are particularly useful).1 (p.693) Surprisingly, research studies involving older adults have found that the heritability of intelligence in that particular cohort can approach 80%.1 (p.695),2 (p.35) One possible account of how the heritability of intelligence can go from 20% in young children to 80% in older adults is that what is heritable may not be raw computing power, but rather proclivities towards certain behaviors and tendencies to seek out certain environments.2 (p.35),3 (p.86) In the Science piece cited above, behavioral geneticist Robert Plomin summed it up by asking, “Do you read books and talk to people who make you think more, or do you lobotomize yourself with television?”2 (p.35)
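(A note on what “heritability” means here: it is the share of the observed variation in a trait across a population that is attributable to genetic differences, not how “genetic” the trait is in any one person. Below is a minimal sketch, in Python, of that variance bookkeeping under a simple additive model; the variance figures are purely illustrative and are not taken from the cited studies.)

# Heritability as a proportion of total trait variance (illustrative only).
# Assumes a simple additive model: total variance = genetic + environmental.

def heritability(v_genetic, v_environmental):
    """Fraction of the total variance attributable to genetic differences."""
    return v_genetic / (v_genetic + v_environmental)

# Young children: environments are largely chosen for them, so environmental
# differences account for most of the variation in measured intelligence.
print(heritability(v_genetic=20, v_environmental=80))  # 0.2

# Older adults: decades of gene-influenced "niche picking" have amplified
# genetic differences, so heritability estimates climb.
print(heritability(v_genetic=80, v_environmental=20))  # 0.8

The same arithmetic licenses the “flipping those percentages around” move in the next paragraph: a heritability of 20% in children leaves 80% of the variation to environmental factors.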
While a ten-year-old's preference for spending a rainy day exploring their local public library, as opposed to sequestering themselves in their room playing mindless video games, may seem pretty trivial, over time such behavioral tendencies can exert a powerful influence on an individual's intellectual prowess. Plomin willingly concedes that testing that explanation experimentally may be difficult, but neither is it impossible.2 (p.35) The takeaway here is that in children, the playing field, in terms of any “edge” one might have courtesy of their genes, is much more level than many have supposed. Flipping those percentages around, 80% of the variation in intelligence seen in children is attributable not to genetic hard-wiring but to environmental factors, and that includes what leisure activities they choose to seek out, so in a sense, it is very much a question of use it or lose it. Personally, I have serious misgivings about the prospect of genetically engineering super-smart humans, and fortunately, for now, we do not know how. I do not want to live on The Planet of the Morons either, so what can be done?
The interplay of organisms' inherited traits with the local environment is the essence of natural selection, and this is just as true of our species as it is of all the other species on the planet. Perhaps the best example of a trait that is mostly genetic is height, the recipe for which, like intelligence, is made up of many different genes. Any human population, in a particular place and at a particular time, say white, adult males born in the United States between 1960 and 1969, will have an average height. A portion of the deviation from that average height (in either direction) of any one individual in that population will be due to the specific combination of genes they have inherited, and the remainder will be due to environmental factors, of which there are also many, and which include things like an individual's medical history. The greatest impediment to reaching the maximum height allowed by an individual's genes is malnutrition prior to reaching sexual maturity. World-wide, the heritability of height is consistently observed to be between 65% and 80%.4 The lesson is that if a population has its basic nutritional needs met, the average height of that population is not very environmentally malleable.
In contrast with height, the fact that only 20% of the variation in intelligence seen in children is heritable should mean that intelligence is far more susceptible to a society's efforts to maximize it by creating environments in which the intellectual development of children can flourish. The fact that by late adulthood the heritability of intelligence approaches 80% clearly shows the epic scope of our failure as a culture to avail ourselves of the robust malleability of intelligence seen in children.ii Instead, we have created a society and culture where people feel entitled not to have to think very hard, or well, about anything, and become indignant when suffering the trauma of the cognitive equivalent of a hangnail or the need to apply a little mental elbow grease to accomplish a goal. We have convinced ourselves that it is an affront to “human dignity” to require ourselves, and others, to think clearly, to master a particular skill that does not have an obvious, immediate use, to present a cogent, evidence-based argument, or to defend against such an argument made by another party.
It just so happens that a substantial portion of the leisure activities I find most rewarding and enjoyable are those that engage my seemingly endless curiosity, challenge me intellectually, and expand my understanding–and appreciation–of the universe we collectively inhabit and of our place within it. Having observed my fellow human beings over the course of my adult life, I can honestly say that a majority of them seem to avoid anything even remotely resembling the sort of things I enjoy the way one might avoid an Ebola-ravaged African village–and I gladly concede that others' avoidance of such things is a right they are entitled to exercise. Though my teenaged self could not have predicted it, looking back over the intervening decades it is obvious that my interests, and hence the leisure activities I chose, have had the side effect of making me much better informed, better prepared, and more competent at navigating the increasingly complex and rapidly changing world of the 21st century.
Reaching any goal worth achieving or realizing any meaningful growth as human beings is impossible without stepping outside our comfort zones; if where each of us wants to be always lay within our comfort zones, then we would already be where we want to be. Pretending that anyone is entitled to never have to step outside their comfort zone, to never feel frustrated, to never have to struggle to learn something new or unfamiliar, or to never face disquieting facts on the way to attaining their goals would be, quite simply, a lie, or if one prefers, an act of bearing false witness.
The broader implications of bearing false witness in matters of the intellect, to ourselves and others, will be the subject of my next post.

References
1. Deary, I. J., Spinath, F. M. & Bates, T. C. ‘Genetics of Intelligence’. Eur. J. Hum. Genet. 14, p.690–700; (2006).
2. Miller, G. ‘Why Are You and Your Brain Unique?’. Science 338, p.35–36; (5 Oct. 2012). doi:10.1126/science.338.6103.35
3. Neisser, U. & Boodoo, G. ‘Intelligence: Knowns and Unknowns’. Am. Psychol. 51, p.77; (Feb. 1996).
4. Lai, C.-Q. ‘How Much of Human Height Is Genetic and How Much Is Due to Nutrition?’. Sci. Am. (11 Dec. 2006). at <http://www.scientificamerican.com/article/how-much-of-human-height/> accessed on: 19 Nov. 2014.

i At least once a week I hear someone say something like “I am so ADD.” My response is to point out that unless they have been diagnosed by a licensed psychologist in a clinical setting, it doesn't count, and to describe themselves as having ADD or ADHD is to misrepresent themselves to others...period.
 
ii Imagine the outcry if, after children in a particular population were weaned (at the age of 3 to 5 years), the heritability of their height was observed to be only 20%, but after reaching sexual maturity, the heritability of height came in at 80%. Their individual genetic makeup remains the same, so we would be scrambling to identify, and remedy, whatever environmental factors were stunting the growth of so many children. Given that it is our cognitive faculties, far more so than our stature, that are essential to thriving in a modern, technological society, why is there not a similar outcry over the wasted potential of so many young minds?

Thursday, November 28, 2013

Happy Thanksgiving, 2013

Happy Thanksgiving to All!
Whatever our personal beliefs, no one is an island‒we all depend on the kindness, generosity, hard work, and sacrifices of our fellow human beings. If a neighbor foolishly fails to thoroughly read the directions on their new turkey fryer and sets fire to their house, but the fire is extinguished before it burns the house down (and possibly spreads to yours), and you feel a need to thank (insert preferred higher power), go ahead‒after all, reciting the right words in the right order demands very little of us in the way of thought, reflection, or labor.
However (you just knew this was coming), a far more sincere, tangible, and morally praiseworthy way of expressing one's heartfelt gratitude would be to do something for the firefighters and other first responders who gave up their holiday to protect the lives, safety, and property of their fellow human beings. Bring a meal to the station house, or ask whether any of them are without family nearby and would otherwise be alone, and invite them to your house for the next holiday. If you have a loved one in a hospital or nursing home during the holidays, don't just say some words to them, actually do something for them. Pick up the phone, make some calls, and get the names of those caring for your loved one during the holidays instead of spending time with their own loved ones. Bake them some cookies, send them a fruit basket, or whatever, but do something.
Many ranchers here in South Dakota are still reeling from the effects of a devastating blizzard in early October, and the response to appeals for help in aiding ranchers who have, in some cases, lost well over half of their herds has been phenomenal. Most (non-vegan) folks will be sitting down to turkey dinners rather than beef today, so remember those who are, this Thanksgiving Day, working to bring to fruition next year's harvest, often laboring for long hours in crappy weather, even on national and religious holidays.
So on this day of Thanksgiving, remembering that actions speak louder than words, do not neglect to say “Thank You” to those who labor, if even indirectly, on our behalf...because we can all agree that they do exist.
P.S.
Just for the record, I hate the term “Turkey Day” because, while I will be having some turkey later this afternoon, I much prefer a holiday ham.

Thursday, October 10, 2013

Just What Does the Far Right Not Understand?

Well, here we are again, pawns in yet another game of “chicken” that puts the economic well-being of the United States of America at risk. The current situation is the result of many things, but I want to point out the complicity of my fellow citizens, because without their ignorance and intellectual laziness, we might not be in the mess we are in.

What galls me the most is how many people do not know that the Affordable Care Act (ACA‒a.k.a. "ObamaCare"–a label that sounds like it was made up by a 7-year-old playground bully–which seems about right given the apparent cognitive capacities of the right-wing rank-and-file) is already a law! It was passed by both houses of Congress and signed into law on 23 March 2010 by President Obama. It then withstood a challenge that went all the way to the Supreme Court of the United States (SCOTUS)! Check it out yourself–there may be a quiz later. The coup de grâce of the whole thing is that after all the wrangling, all the far-right rhetoric, all the "tea party" protests, and the legal challenges...the President who spearheaded the push for the law was re-elected in a campaign against an opponent who swore he would immediately repeal the law if elected!

Why is it that so many on the Right do not get that? Did they miss some episodes of Schoolhouse Rock!? The right of citizens to share their opinions with others until they are blue in the face, which, as a matter of principle, I would give my life to defend, in no way, shape, or form means that the content of their opinion(s) is entitled to anyone’s respect, independent of the merits of said opinion(s). Students of all ages, from elementary school to grad school, are expected to turn in their assigned homework, but the teacher grades the work on its merits alone, and the same principle applies in the marketplace of ideas.

Citizenship in a democratic republic is serious business, and if such a nation is to endure, it demands that its citizens do their homework before opening their mouths, pulling the lever, punching a chad, or blackening in a box. The Framers knew that the only way our young republic would thrive was to have an educated, informed electorate. For the Framers, the bloody English Civil Wars of the 17th century were recent history, and they acknowledged that human nature had a darker side, where passions frequently trumped reason, which is why they designed our system of government with the system of checks and balances they did.

Early efforts to ensure the ideal of an “informed” electorate led to things like requirements that one be a white, land-owning male‒requirements which, however well-intended they may have been to begin with, were soon used to systematically disenfranchise, by law, whole classes of citizens‒women and African-Americans in the Jim Crow South, to name just two such groups. As a nation, we (at least most of us) learned to reject such things as antithetical to the ideal of a participatory democracy. There are, however, steps we can, and must, take in our everyday interactions with others to minimize the damage caused by baloney, propaganda, and outright deception. We do not impose legal sanctions on people picking their nose in public, but we don't need them, because the embarrassment people feel upon learning that others think them an uncouth, gross, disgusting boor for doing so is sufficient to quickly cure most people of the habit while still adolescents. Similarly, “civil discourse” does not mean giving someone spouting patently false nonsense a pass out of concern for their feelings, nor does it mean that we throw them in irons and send them to a dungeon for being idiots. The “civil” in civil discourse hearkens back to the (albeit idealized by us today) age of the ancient Greek agora and the Roman forum, where citizens engaged in economic activities and discussed and debated matters affecting the polis (in Latin, the civitas)‒what we would call today the citizen body.1 (p.204)

As individuals and citizens, we must realize, and remind others when necessary, that in any discussion, debate, or outright argument, we must not only respect the rights of others to speak their minds, we must also defend our right not to have our time wasted. If our fellow citizens, elected officials, and media talking-heads demand that their right to be heard be respected, then we, as the “audience,” have an equal right to demand of those laying claim to our time and attention that they do their homework and not insult and disrespect their audience by wasting its time.

Thomas Jefferson wanted his gravestone to note the three achievements of which he was most proud: the Declaration of Independence, the founding of the University of Virginia (the first university in the former colonies intended, from the ground up, to have no religious affiliation), and his authorship of the Virginia Statute for Religious Freedom. For this essay, the money quote is in the last paragraph of the Virginia Statute:

“...all men shall be free to profess, and by argument to maintain, their opinions in matters of Religion, and that the same shall in no wise diminish, enlarge or affect their civil capacities.”2 (p.289–90) (emphasis mine)

The point is, the right to publicly air ideas, beliefs, and opinions carries with it a duty to defend those ideas, beliefs, and opinions. If one's constitution (or intellect) isn't up to the task of defending their deeply-held beliefs using argument and reason, there are places where one can talk about them with little fear of criticism...like churches and NRA conventions. The trick is to not let one's beliefs write checks that one's intellect can't cash, and to have the courage to keep ourselves, and others, honest.

References

1. Price, S. R. F. & Thonemann, P. The Birth of Classical Europe: A History from Troy to Augustine. (Viking: New York, N.Y, 2011).

2. Jefferson, T. The Life and Selected Writings of Thomas Jefferson: Including the Autobiography, the Declaration of Independence & His Public and Private Letters. Ed. Adrienne Koch & William Peden. (Modern Library Paperback: New York, 2004).

Monday, September 30, 2013

Intellectual Honesty, Atheism, and Faith

I was recently asked two questions by a long-time family friend, who also happens to be an ordained Assembly of God minister. One was how an “intellectually honest” atheist could deny the “historical fact” of the Resurrection of Jesus of Nazareth. The other was one atheists have heard (and answered) too many times before: “How is atheism not a faith too?”

Intellectual Honesty


There is nothing “intellectually honest,” at all, in asserting that any miraculous, supernatural phenomenon is a “historical fact.” This is so obviously wrong, and on so many levels, that it was difficult to know where to begin. Here is a partial (and abbreviated) list of what is, and is not, intellectual honesty:
Intellectual honesty...

  • does not allow one to ignore evidence that goes against whatever it is that they want to be true (e.g. “So, Mr. President, what was it that made you think Saddam had all those WMDs in the first place?”)

  • is implacably opposed to compartmentalized thinking (the division of the Christian Bible into chapters and verses is a perfect way to encourage compartmentalized thinking and its handmaiden, hypocrisy)

  • requires that every link in a chain of reasoning must hold, without exception, no excuses, and no special pleading allowed

  • mandates that any attempt to sidestep, evade, or ignore these rules, by any party to a discussion, constitutes sufficient grounds for forfeiture of any claim to be taken seriously 

Doing My Homework


As part of my homework, a word that appeared many times in my reply, I pointed out the contradictions in the Synoptic Gospels' (i.e. Matthew, Mark, and Luke) narratives of the events leading up to the Crucifixion of Jesus of Nazareth (JoN). I also put together a table of the discrepancies found in the Synoptic Gospels' accounts of the Resurrection.i Having been raised an evangelical/fundamentalist Christian, I know my stuff when it comes to the Bible. As a teenager, I was intelligent and knowledgeable beyond my years and was in adult Sunday school/Bible-study classes throughout high school (some folks even thought I should go to seminary myself‒ironic, is it not?), and I can run circles around every Bible-thumper I have ever met (I may have a lousy working memory thanks to my adult ADD/ADHD, but I have a very large, fast, and well-indexed hard drive).


Crucifixion by Contradictions


As I was fact-checking myself on the discrepant accounts of JoN's arrest and “trial” before the Jewish authorities and Pilate, I came across something I had never noticed myself, nor recalled hearing or reading about elsewhere, and it blew my irony meter to smithereens.

To be honest, my irony meter was already a bit strained by the whole concept of an “intellectually honest” acceptance of miracles as a “historical fact.” As I was reading the account of the events leading up to the Crucifixion in the Gospel of Mark (14:55-59), my irony meter exploded when I came across the following passage; after reading it yourself, you may see why.

“55The chief priests and the whole Sanhedrin were looking for evidence against Jesus so that they could put him to death, but they did not find any. 56Many testified falsely against him, but their statements did not agree. 57Then some stood up and gave this false testimony against him: 58“We heard him say, ‘I will destroy this temple made with human hands and in three days will build another, not made with hands.’” 59Yet even then their testimony did not agree.” (emphasis mine)

Unless the author of Mark was from another planetii, even he thinks that contradictions and conflicts between the testimony of eyewitnesses–to the same events–ruin the credibility of those witnesses! The author is not named in the text itself, but he has traditionally been identified with one Mark, a companion/interpreter of the apostle Peteriii, so for convenience, I will call him “Mark.” So anyway, Mark goes out of his way to make clear that, in essence, Jesus' accusers were idiots because they couldn't even keep their lies straight. This passage also indicates that Mark's audience had a positive expectation that the testimony of honest eyewitnesses would agree.

The New Testament (NT) canon familiar to western Christians has not changed much since the Latin Vulgate was assembled by the beginning of the 5th century C.E., and the whole time, there sat this little bombshell. The consensus among biblical scholars is that Mark represents the earliest surviving Gospel–with the authors of Matthew and Luke, the other Synoptic Gospels, borrowing heavily from Mark. In an additional twist of irony, Matthew and Luke, though they borrowed much from Mark, appear to have missed the import of Mark 14:55-59–if they had grasped it, they might have taken steps to ensure their stories agreed. Not only that, but what about all those copyists down through the centuries: did none of them ever notice the discrepancies and attempt to fix them? By the logic of the author of the Gospel of Mark himself, these much-overlooked four verses impugn the credibility of the four Gospels themselves. This is known as being hoisted by one's own petard (gratuitous Shakespeare reference–check). It is also a great example of special pleading–the blatant intellectual dishonesty of pointing to the discrepant testimony of the witnesses against Jesus of Nazareth as evidence that agents of Satan were out to foil God the Father's divine plan, then turning around and pretending not to notice that the same charge can legitimately be leveled against the veracity of the Gospels themselves.

Atheism a “Faith”?


Believers in the monotheistic religions make the positive claim that God exists and that their religion (whatever it may be) is the one “true” faith. Specifically, my friend believes (this is an assumption, but his phrasing of his question makes it a reasonable one) that the Resurrection of Jesus of Nazareth is as much a “historical fact” as the Sack of Rome by the Visigoths in the year 410 of the Common Era, the Battle of Hastings in the year 1066 of the Common Era, which clinched the Norman Conquest of England, or the Moon landings. This claim rests upon a potentially limitless number of unstated‒and undemonstrated‒major and minor premises, which include, but are not limited to, the existence of the God of Christian Scripture, which in turn presupposes the existence of supernatural realms (and entities to inhabit them) not otherwise subject to natural laws. The evidentiary burden required to establish such fantastical claims is incredibly high.

All the evidence‒not just the cherry-picked bits Christians use to persuade the incurious and gullible masses, but also the evidence that reveals just how weak and thin the veneer of historical plausibility Christians have pasted onto their supernatural myths actually is, myths little different from those of the other cults of the eastern Roman Empire in the first century C.E.‒has been thoroughly, skeptically, and intellectually honestly evaluated...and has been found wanting. The burden of proof is nowhere near being met, which justifies the rejection of whatever claims Christians might make as to the “historical fact” of the Resurrection, of its unstated major and minor premises, and of any claims it is, in turn, the basis of.

“Faith” is what gives parents license to refuse evidence-based medical care to their sick child, and to feel that their refusal is “holy,” and to have that feeling endorsed and supported by their fellow believers. My atheism, my non-belief, is not a “faith”; it is a verdict, a verdict arrived at by refusing to ignore what Christians blithely ignore, by allowing my reason to follow my natural curiosity, and the evidence, beyond the mind-numbing echo chamber of religious “faith,” and by demanding the same standards of intellectual honesty we demand of our system of justice and of any other sphere of human intellectual endeavor.

I have yet to receive a reply from my friend...





iMy table, though I did all the reading, formatting, and general grunt-work myself, was inspired (no pun intended) by:

Ehrman, B. D. ‘Chapter 1: A Historical Assault on Faith’. Jesus, Interrupted: Revealing the Hidden Contradictions in the Bible (and Why We Don’t Know About Them). p.8; (HarperOne: New York, 2009).


iiAt least on this planet, despite our cultural and linguistic differences, the emotions we feel and how we express them are, for the most part, universal. For instance, even if you do not speak French, the French metaphor “faux pas” will make sense when translated into your native language. This is because our species, for the most part, shares a common inventory of emotions.


iiiSchröter, J. ‘The Gospel of Mark’. The Blackwell Companion to the New Testament. Ed. David Edward Aune. p.272–95; (Wiley-Blackwell: Chichester, U.K.; Malden, MA, 2010).

Sunday, July 15, 2012

eBooks and I

I love to read. I love books and the written word in general. One of the greatest pleasures of my life is to curl up on my couch or stretch out on my bed with a good book‒a real book, with a binding and pages made of paper‒no batteries required. I like having good books on my shelves, and when invited into someone else's home, the presence or absence of tangible, physical reading material, and when present, the subject(s) of the reading material can often, fairly or unfairly, inform my opinion of those whose home it is. I am not rich, or even well-off, by any measure, but I am proud of the depth and breadth of the works in my library of bound books.
As long as there are at least some people who like to collect things like stamps, baseball cards, and music and motion pictures recorded on a physical medium (i.e. CDs and DVDs/Blu-rays), I suspect there will also be those who will enjoy, and continue to purchase, physical, bound books. From a marketing standpoint, if book publishing went entirely digital, what would become of that staple of the publishing industry, the book tour? What gets readers to clear their calendars and brave the most inclement weather to attend a talk by a favorite author promoting their latest book? From the reader's standpoint, it is not so much the chance to hear the author speak; the biggest inducement is the chance to interact with a favorite author and ask them to sign the new book. In a world of Kindles, Nooks, etc., what would be the point? Will we hear readers say things like, “See this scuff mark on my Kindle™? I got that when I downloaded James Patterson's latest ebook”? I think not.
Because I have such a deep appreciation for, and love of, the written word, it is not surprising that I consider the effort to express my own thoughts and ideas in the same way‒and to do it well, hopefully improving with practice‒a very worthwhile endeavor. When I write about subjects that depend on getting one's facts right, I take great pains to research and document my sources (using Zotero‒a superb open-source bibliographic citation program). I readily admit that the diligence with which I dot my “i”s and cross my “t”s sometimes borders on the obsessive-compulsive. To the extent that I am a bit OCD about citing my sources, my defense is that I dread being caught with my pants down in an intellectual sense. Another defense is the daily frustration I feel when confronted by the fact that most people do not seem to give a hoot that they are talking out their asses about subjects of which they are utterly ignorant; I want to share as few traits with such people as humanly possible.
I have a tattered, well-used copy of the 27th edition of the CRC Standard Mathematical Tables that I acquired back in 1987; it had been used at a Navy technical school I attended, and because its pages were falling out and were well-annotated by the students who came before me, it was being replaced with newer copies in better condition. That book went on to be further annotated by me and to provide invaluable help not only to me as I studied calculus, physics, and electrical engineering on the way to earning my undergraduate degree, but to my daughters as they studied geometry, algebra, and trigonometry in high school.
Currently I have over 1.5 GB of scholarly, peer-reviewed papers from on-line databases like EBSCO and ProQuest. I also make frequent use of Cornell University Library's outstanding arXiv repository of papers in the quantitative sciences. Additionally, being a dues-paying member of the AAAS, I also have online access to the journal Science and its daughter publications. (My mother thinks it is a hoot that some of the mail I get from the AAAS is addressed to Dr. Northrup.) I also have a paid subscription to the online Questia library, a great source of books and journal articles in the humanities and social sciences. The vast majority of these papers are in PDF format (with the exception of the material accessed through Questia) and are duly recorded in my Zotero library, both in the “cloud” and locally on my laptop. Using Zotero (I am not a paid spokesperson), I am able to copy/paste what I call “money quotes” from research papers of interest rather than going through the laborious process of re-typing them myself.
My current library of serious, non-fiction ebooks, in various formats, comes in at just under 20 GB. It includes everything from the nine-volume Cambridge History of Christianity (relevant to a writing/research project I am currently working on) to science textbooks and references like An Introduction to Astronomy and Cosmology by Ian Morison. I was delighted to come across the ebook version of the 32nd edition of the CRC Standard Mathematical Tables and Formulae, which I quickly snapped up for reasons of nostalgia. When researching whatever project I am working on, I have found being able to copy/paste passages from items in my ebook library into Zotero for later use to be as invaluable an aid as it is when quoting from a paper downloaded from the journal Science.
When I write, even for this blog, I always compose and polish my work using MS Word or LibreOffice, and in the infrequent and brief snippets of down-time at my day job, I sometimes work on personal writing projects (don't tell my boss). I have a smaller, portable version of my library of downloaded research papers and ebooks, as well as various ebook reader software, on a flash drive for ready access, regardless of what computer I am at. Having read this far, you will probably not be surprised to learn that I had not paid for any of the ebooks in my library...until this weekend.
Earlier (on Saturday, July 14, 2012 to be precise), I came across a post by Jerry Coyne on his excellent blog, Why Evolution Is True, the discussion of which I wished to contribute to. The post involved Sam Harris's latest book, Free Will, which I had not yet purchased or read, and because I like his writing, this seemed as good a time as any to plunge into the world of legitimate (i.e. DRM-restricted) ebooks. In the interest of full disclosure, the same day I also obtained a copy of Free Will (for free) via BitTorrent, which is how I amassed the aforementioned 20 GB of ebooks. Many of the works on popular science, freethinking, atheism, and related topics in my ebook library I also have as printed and bound volumes. When a new book by an author whose work I follow (e.g. Dawkins, Harris, The Hitch (R.I.P.), Lawrence Krauss, Dennett, Pinker, et al.) comes out, I look forward to curling up with the physical book.
After checking out a number of ebook vendors online, I purchased the ebook Free Will through the online ebook store Kobo and found it a thoroughly disagreeable experience that I will not, as long as I have any choice in the matter, ever repeat. If I want to read a book for my own enlightenment and pleasure, I will continue to prefer old-fashioned bound books, and will gladly pay full price for them. Ebook readers like the Kindle, the Nook, and the iOS readers (don't even get me started on PC vs Mac‒at least not right now) are worthless to me, whether I am reading for pleasure or for research. When researching a topic for my writing, I often have my working draft and whatever ebooks are relevant to the subject at hand open at the same time, on the same computer. Further, ebooks that cannot be highlighted or otherwise annotated, and for which the copying of text (to ensure I properly cite/quote particular passages) is disabled, are of no value to me whatsoever. To my mind, the various ebook reading platforms and file formats are simply a scheme to lock consumers into a particular type of hardware that is obsolete almost as soon as it hits the shelves and will need to be replaced/upgraded in lockstep with Moore's Law. A nice racket...err...I mean "business model," I'm sure.
As a student, a frequent frustration was finding a research paper that, judging from the abstract, was exactly what I needed, but was only available behind a paywall set up by the likes of Elsevier, Springer, or Wiley. It seems I was not the only one outraged by this (see here, here, and here, just for starters). After my experience with DRM-protected ebooks this weekend, my opinion of the ebook publishing world is now almost as low as my opinion of Elsevier and friends, and the ebook publishers' motives are no less base, despicable, or contemptuous of those they hope to manipulate by such practices.


Monday, May 7, 2012

Citing Sources

In my most recent post introducing my on-going series on the 2012 elections, I went on at some length about “doing one's homework.” I hold myself to that same standard, with at least some consistency, I hope. A reader might have noticed that I cite my sources in many, if not most, of my posts, so I thought I should give a brief account of my thinking regarding citation styles. As an undergraduate I took upper-level classes from many different disciplines: physics, engineering, geology, biology, and political science...to name a few. The default citation format I cut my teeth on was the venerable Modern Language Association (MLA) style. This makes sense when one considers that most undergrads are introduced to writing “scholarly” papers not within their own major, but in courses taught by faculty from the English department.
One of the things I like about the MLA style is that it is set up to handle a very wide range of sources, from peer-reviewed journals to on-line videos of scientific symposia and just about everything in-between. Like many students, I used a bibliographic citation software package, specifically, EndNote. However, EndNote is very expensive and I was delighted when I learned of Zotero, a free, open-source alternative to EndNote and its pricey competitors.
For a professor grading a stack of papers written by undergrads, the MLA style is nearly ideal because the in-text citations are obvious (or very "in-your-face," depending on one's mood) and are easy to reconcile with the list of “works cited” at the end of the paper. I get that. Though I am no longer a student, I still want to show that I have done my homework in what I write, but the very thing that makes MLA great for professors grading papers, the obviousness of the in-text citations, makes an MLA-formatted paper hard to read for anyone who is not an English professor, because the effect is visually quite jarring.
After some playing around with the Citation Style Language (CSL) used by Zotero, I have found that I really like the in-text citation format used by the British journal Nature. It consists of a simple, unobtrusive superscript within the text that corresponds to an entry in the references at the end of the paper. However, the Nature style is not set up to handle nearly the same diversity of sources that MLA is, so I have had to tweak it a bit to make it work. It is still very much a work in progress, so if a reader cannot easily place the citation style I use, now they know why.

Thursday, April 26, 2012

2012-The Very Long Year-Introduction


Election years in the United States typically feel long, and 2012 is shaping up to be a very long election year. Indeed, one could even say it began as soon as the last polls closed on November 4th, 2008. This post was originally intended to be a one-off; however, like so many other posts, as I wrote it, I was constantly saying to myself, "If I cover this fact or concept here, I also need to mention that supporting (or contrasting) bit from over there"‒and the whole thing snowballed from there. The original impetus for the stand-alone piece was the blow-up over Rush Limbaugh's juvenile, schoolyard-bully-style attacks on the character of Georgetown University law student Sandra Fluke following her testimony before Democratic members of a House sub-committee. The subject of her testimony was contraception availability and the impact it has on women's reproductive health. Not surprisingly, instead of challenging the factual claims made in Ms. Fluke's testimony, something far beyond the pathetically limited scope of Limbaugh's intellect (not to mention that of his target demographic), the best he could do was resort to name-calling. The specifics of Ms. Fluke's testimony, Limbaugh's contemptible comments, and those of his Right-Wing Authoritarian (RWA)i sheep will be covered in a later post.
In this post, I throw down the gauntlet and lay out my ground rules for any discussion or debate that purports to deal with the world around us. The gloves are off. I am through coddling social, religious, and political conservatives (and when I encounter people on the left who are equally ignorant, I will be just as intellectually brutal with them too). Let this be fair warning‒from now until I revert to precisely the same state of non-being I was in (suffering no discernible harm, by the way) for the entire 13.7 billion years from the Big Bang to just prior to my birth‒I will no longer remain silent when confronted by confident assertions made by people who have failed to do their homework. I always take considerable care in fact-checking myself, in what I write and in my everyday conversations with others. As a culture, we have little sympathy (for the most part) for a kid who blows off their homework in favor of playing video games and then embarrasses the hell out of themselves the next day in class when they try to bluff their way through a discussion of the assigned material. Most grown-ups would consider such embarrassment their "just deserts," one that would (hopefully) be a powerful motivator not to get caught with their intellectual pants around their ankles in the future, a valuable lesson in the journey toward maturity.
Paradoxically, upon reaching what can be loosely called "adulthood," the desire to avoid publicly embarrassing oneself or looking like an ignoramus seems to undergo a curious inversion in some individuals. In the classrooms of our childhood and adolescence, those who pretended to know things they clearly did not were soon exposed, providing ample reason to get our facts straight, have our ducks in a row, dot our i's, cross our t's, and do our homework. One would think that as adults, we would hold ourselves and others to a higher, not lower, standard of intellectual honesty than we hold children. As adults, we would certainly not want physicians who bluffed their way through medical school treating our loved ones or ourselves. Nor would we want our cars worked on by mechanics who were given passing marks for their ASE certifications and training merely because their instructors felt sorry for them. Lawyers who have not done their homework and dare appear in front of a judge are ruthlessly criticized and will have few clients, and should we, as private citizens, ever find ourselves in a courtroom, whether civil or criminal, we have every right to demand that the attorney representing us has done their homework.
If our child were suffering from an unknown illness, we would demand that the treating physician leave no stone unturned and allow no assumption to go unquestioned in identifying the malady and how to treat it. In our daily lives, however, when it comes to politics, social policy, etc., whether in conversations with family, friends, and co-workers or in the mass media, it is not the person who is, not to put too fine a point on it, "talking out their ass" who is shamed and embarrassed, but rather the one who dares to call them on it who is vilified. By way of comparison, if you enter into a conversation with someone who has a mania for the minutiae of some subject or activity, whether it be Star Trek or NASCAR, they will soon know whether you are merely a dabbling dilettante or whether you "know your stuff." If they determine that you are a mere pretender, few will hesitate to dismiss you as a "wannabe" or its equivalent.
As citizens in a democracy, one of our most consequential acts is going to the polls. The intellectual effort, the due diligence, the conscientiousness with which we educate ourselves concerning the facts of the issues before deciding who or what to vote for are every bit as essential to the continued health of our representative democratic republic as the rigorous studies of a physician or surgeon are to the health of their patients. Paradoxically, our political discourse, at the level of individuals and in society as a whole, is rife with examples of people holding opinions that have no basis in actual facts. In the words of a 19th century humoristii, "It ain't so much the things we don't know that get us into trouble. It's the things we know that just ain't so." In my office, an older co-worker has one of those 8 ½ by 11 inch line drawings, like countless others that circulated in offices throughout the country back when photocopying and fax machines were still a novelty. The picture depicts the face of an "old lady" holding a coffee mug, telling folks, "Don't believe everything you think." I think that little "poster" should be placed outside every voting booth in the country.
As Altemeyer observed in The Authoritarians2, based on subjects' responses to other survey instruments, certain people can be reliably predicted to fail a simple test of inductive reasoning. What the results showed is that as long as those who actually failed the test thought the conclusion was true, they were utterly oblivious to the faulty reasoning used to arrive at the conclusion (or they thought it did not matter). This is why many mathematics teachers require their students to show their work and why some give partial credit‒because the point is to learn the complex steps involved in solving certain kinds of math problems. Once a student has the steps down, then they can concentrate on the silly mistakes we all make, like forgetting a negative sign or some such. The importance of being able to make a logically consistent argument, and, not co-incidentally, of knowing what a poorly constructed argument looks like, is a primary reason that Euclidean geometry is still taught in high schools. It may sound a bit lame or lacking in a certain "rigor," but a university I once attended even allowed students to take a course in formal logic to satisfy a core math requirement–because the goal was to teach logical thinking.
A relatively uncontroversial (hopefully) example that illustrates the interplay between opinions and facts, and which I will later apply to more controversial ideas, comes from the history of the Second World War. There have been those who maintain that Franklin Delano Roosevelt (FDR) was in possession of what we would today call "actionable intelligence" of an impending attack on U.S. forces in the Pacific. In one sense, it did not require a genius to predict that the United States cutting off exports to a resource-poor and ruthlessly expansionist Japan would not go over well and that open conflict would be the likely result. Given that the principals involved are now dead, as a practical exercise it would be hard to interrogate those who were in a position to know. But regardless of how "impractical" it may be to ascertain, 70 years after the fact, who in FDR's administration knew what, if anything, and when, or if, they knew it, the only thing that, even in principle, could ever possibly decide the matter would be evidence. How one feels about the New Deal, the Lend-Lease program, FDR, or any other aspect of the politics of the time is irrelevant.
The most contentious and divisive topics in the areas of public policy arise largely because of differing ideas of the real purpose of laws and government institutions in a society. Right-Wing Authoritarians (RWAs) are able to get away with many of the things they do in setting public policy because, as individuals and as a group, their feet are seldom held to the fire and they are seldom pressed for their true motivations for supporting the policies they do. When I say "holding their feet to the fire" I mean something like the climactic scene in A Few Good Men3, where Lt. Kaffee relentlessly presses the self-righteous Col. Jessup until he tells the truth‒that he ordered the "Code Red," convinced the whole while that he had done nothing wrong.
The idea of "doing one's homework" when forming our beliefs and opinions is part of the more general (and very rare) virtue of intellectual honesty. Intellectual honesty not only requires that we be willing to defend our opinions and beliefs, but that we are also obliged to honestly acknowledge the motivations and assumptions underlying them. It seems that on some level, RWAs seem to instinctively know that to come right out and say the actual reasons why they take the positions they do regarding certain subjects will expose them to public ridicule. Aside from the ravings of anti-vaccination nut-jobs, most folks, RWA's included, recognize that promoting public health through vaccination programs, fluoridation of water, etc., is a legitimate area of concern for governments‒until the public health concern in question has any connection, however tenuous or remote, to sex. In developed, liberal democracies throughout the world, it is generally acknowledged that unwanted teenage pregnancies and the unchecked spread of sexually transmitted diseases (STDs) have significant economic, social, and public health costs and are no less a legitimate public health concern than preventing flu pandemics. There would be near-universal outrage if a government were to mandate the use of a particular treatment for a specific disease for any other reason than that it actually works.
If a society, or a government that claims to act in the name of its citizens, is serious about reducing the human suffering, misery, and deaths caused by smallpox, the only legitimate criterion is: do the vaccines in question actually work as advertised? If anyone were to propose an alternative, we would require that the alternative be more effective, period. Before spending tax dollars on an ad campaign to educate consumers about properly handling and cooking meat in an effort to reduce food-borne illnesses, we would demand that the precautions advocated actually be effective. Proposed solutions to societal ills that seem to have little to do with whether or not the solutions in question are actually effective in fixing or mitigating the problem should set off all sorts of alarms in the minds of all intelligent, thoughtful, and honest human beings. In my next essay, I will expose the moral pretensions of RWAs by looking at one of the hot-button issues of the upcoming election.


iDr. Robert Altemeyer has been researching the authoritarian personality since the mid-1960s. When the horrors of Hitler's "Final Solution" started to dawn on the rest of humanity, many sought to understand how people, otherwise decent, normal, educated folks, can so totally surrender themselves to a charismatic leader with a brutal ideology. Novelists like George Orwell and Kurt Vonnegut explored these questions through their fiction. While some “social scientists” indulged in various forms of moral relativism (I will spare the reader a rant against “postmodernism”), other social scientists felt it essential to understand what combination of individual and societal factors makes it possible for the citizens of a modern nation, solidly a part of the "Western Tradition," to go along with the Holocaust, indifferent to the enormity of what was done.

Serious social scientists like Philip Zimbardo (the Stanford Prison Experiment) and Stanley Milgram (the Milgram Experiment) explored the situations and contexts in which people surrender to “authorities” and can be goaded to commit moral atrocities they would not commit if left to their own volition. Altemeyer's contribution was in identifying two distinctive types of “authoritarian” personalities. Obviously there were “authoritarian leaders,” e.g. Hitler, Stalin, Mussolini, Franco (Spain)‒they are easy to spot. All by himself, such a leader is merely a frustrated demagogue; to be dangerous, he needs followers‒lots of them. This is why much of the research into the "authoritarian personality" following the Second World War focused on authoritarian followers.

In Altemeyer's research, he defines "Right-Wing Authoritarians" to be (in part) those who submit to established authorities and rigidly adhere to conventional ways. "Left-Wing Authoritarians" would be those who submit to leaders who would overthrow the established, traditional authorities‒think 1960s hippie radicals‒a rare breed in the United States today. Keep in mind that on the conventional "left-right" political spectrum, Soviet or Chinese-style Communism (note the capitalization‒when you see it, I wish to make a distinction between socialism/communism in general and a specific instantiation of it, the same way that we would describe the United States as being a democratic republic) is deemed to occupy the far left end of the spectrum, but for someone living under such a system, that Communism is the established authority. A zealous supporter of conventional ways and the "party line," whether in the United States or in Soviet Russia, would be a Right-Wing Authoritarian (RWA) follower.

iiAmong late 19th century American humorists, Mark Twain (1835-1910) is the most famous. However, on the quotation sites I consulted, no instance attributing the quote to Twain provided a title for the containing work. Geoff Colvin, in his book Talent Is Overrated: What Really Separates World-Class Performers from Everybody Else,1 quotes a contemporary of Twain's, Josh Billings (his real name was Henry W. Shaw, 1818-1885), as saying: "It ain't so much the things we don't know that get us into trouble. It's the things we know that just ain't so." Elsewhere, Billings is quoted (at: http://www.qotd.org/search/search.html?aid=3945&page=4) as: "It ain't what folks know that's the problem, it's what they know that ain't so."

The 1876 book, The Complete Works of Josh Billings, p. 286, contains the following quote: "I honestly beleave it iz better tew know nothing than two know what ain't so." [sic] The careful citing of sources seen today was not all that common in the 19th century, except perhaps in scientific circles. Attempts to correct the "loose" spelling (by modern standards, not for the times in which it was composed) of Billings' phrasing neatly account for the many variations in phrasing of the sentiment expressed, as later writers “cleaned up” Billings' very astute observation to make it less jarring to more modern readers.


References
1. Colvin, G. Talent Is Overrated: What Really Separates World-Class Performers from Everybody Else. (Penguin: 2010).
2. Altemeyer, B. The Authoritarians. (2006). at <http://home.cc.umanitoba.ca/~altemey/>
3. Reiner, R. A Few Good Men. Film. (1992).