Saturday, December 3, 2011
I just got home (late on Dec. 2) from a five-day conference of Veterans' Employment Representatives on the new materials for the Transition Assistance Program (TAP), which serves service members about to leave the military, and I wanted to get my thoughts out there with little polishing. The conference, my "classmates," and especially our "trainer," Dr. Beverly Hyman, were the best part of the experience. Dr. Hyman and her husband co-authored the book How to Know If It's Time to Go: A 10-Step Reality Test for Your Marriage (which I will be buying and reading; I have been divorced almost ten years now).
I have written before of my ADD/ADHD, and anyone who knows me would not be surprised that I was that annoying student who was always raising his hand to contribute something to the discussion. There were times I had to force myself to remain still and let others have a chance. It is not (I hope) that I am really that self-absorbed; it is just that, despite my ADD/ADHD, or (here comes the possible epiphany part) because of it, I seem to have a knack for finding connections, metaphors, or analogies between seemingly dissimilar ideas or concepts. One of the diagnostic features of ADD/ADHD is a deficit in "working memory," and that is me to a "T." Might it be that a deficit in working memory forces me, and perhaps others with ADD/ADHD, to draw on long-term memory instead? To use a computer metaphor: we compensate for not having enough RAM (where programs and data are stored while in use, and which is cleared when the computer powers down) by having a fast and very well-indexed "hard drive" (long-term memory) that can make rapid connections to things it already knows.
Jet fighters are designed so that, aerodynamically, they are just on the verge of being uncontrollable. Key to the survival of military jets in air-to-air combat is their maneuverability, whereas predictability and stability are what you want in a commercial or military aircraft that carries people or other cargo. Perhaps those with a good working memory are able to stay focused and "on task," like a well-designed passenger aircraft. Combat aircraft, on the other hand, are too unstable for a human to fly unaided; it is only the ability of computers to make tiny, millisecond-by-millisecond adjustments to the flight controls that keeps them stable at all. But when they do need to maneuver, they can do so incredibly quickly, in much the same way that someone with ADD/ADHD can quickly see how a new piece of information relates to something they already know.
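For the computer-minded, here is a toy sketch in Python of that idea: a fast correction loop can keep a system stable that would otherwise run away on its own. All the numbers are invented for illustration; real fly-by-wire systems are vastly more sophisticated.

    # Toy model only: an "aircraft" whose pitch error grows on its own
    # (aerodynamic instability), stabilized by a computer applying a
    # small corrective input every millisecond.
    def simulate(correction_on: bool, steps: int = 5000) -> float:
        dt = 0.001          # 1 ms control loop
        instability = 2.0   # errors grow exponentially when uncorrected
        gain = 5.0          # strength of the computer's correction
        error = 0.01        # small initial disturbance
        for _ in range(steps):
            control = -gain * error if correction_on else 0.0
            error += (instability * error + control) * dt
        return error

    print(simulate(correction_on=False))  # blows up (~220) without control
    print(simulate(correction_on=True))   # decays toward zero with control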
I have some thinking and reading to do. I will certainly have more to say later.
Saturday, November 5, 2011
- the plot would still involve a black hole (cue ominous music)
- the word “supernova” (hereafter: “SN”) would still be used on-screen
- the word “hypernova” (i.e. a really bad-ass big brother to a mere supernova) could be used as well
- an opportunity would be created for some really awesome FX "disaster porn" not yet depicted on the big screen
So, beginning at the beginning...
Throughout their lives, all the stars we can see on a clear night (and even those we cannot) exist in a state of equilibrium between the outward force of the energy released by the fusion of lighter elements into heavier ones in the star's core, and gravity, which acts to collapse the matter of the star toward a central point. When a star is being born from a molecular cloud, the pressure and temperature at its core increase steadily as it condenses until, at about 15 million K,[i] thermonuclear fusion begins. Once the fusion process has begun, the outward radiation pressure of the energy thus released begins pushing back against gravity, settling into a state called hydrostatic equilibrium, and there matters stay, at least as long as there is enough "fuel" to keep fusion going and pushing back against gravity. Our own Sun is just barely middle-aged for a star of its size and luminosity and will go on largely as it is today for another 5 billion years.
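For readers who like their equilibria in equation form, the standard textbook statement of stellar hydrostatic equilibrium, where P is the pressure, ρ the local density, m(r) the mass enclosed within radius r, and G the gravitational constant, is:

    \frac{dP}{dr} = -\frac{G\,m(r)\,\rho(r)}{r^{2}}

The pressure gradient, fed by fusion, exactly balances the pull of gravity at every radius.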
All good things must come to an end, and so it is too with the fusion gravy train. The problem is that a star's supply of fusible elements is not infinite. As lighter elements are fused into heavier ones, those heavier products sink toward the star's core, which is also where the highest temperatures and pressures, necessary for fusion, are found. The kind of fireworks called for by the movie-going public in their lust for great science fiction disaster porn needs a star (or stars) many tens of times the mass of our Sun.
The scale of the disaster porn I'm referring to can only come from the most energetic cosmic phenomena ever observed by humans; the only more powerful event yet conceived by science is the Big Bang itself. These cosmic phenomena are called gamma-ray bursts (GRBs). Gamma-rays are the most energetic, highest-frequency form of electromagnetic radiation. To provide a tangible sense of their energy: anyone who has ever had x-rays taken has likely worn one of those heavy, lead-lined "bibs" to shield the more delicate parts of the body (e.g. the reproductive organs). Gamma-rays can penetrate up to several centimeters of lead, over ten times the thickness used to shield patients getting x-rays.
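To put rough numbers on that comparison, here is a minimal Python sketch of exponential beam attenuation, I = I0·e^(−μx). The attenuation coefficients below are order-of-magnitude values I chose purely for illustration; consult the NIST tables for real work.

    import math

    # Fraction of a photon beam transmitted through a shield of the
    # given thickness: I/I0 = exp(-mu * x). Coefficients are rough,
    # illustrative values, not reference data.
    def transmitted_fraction(mu_per_cm: float, thickness_cm: float) -> float:
        return math.exp(-mu_per_cm * thickness_cm)

    # Diagnostic x-rays (~60 keV) in lead: mu is huge, a thin apron suffices.
    print(transmitted_fraction(mu_per_cm=50.0, thickness_cm=0.05))  # ~0.08
    # ~1 MeV gamma-rays in lead: mu ~ 0.8/cm, so centimeters are needed.
    print(transmitted_fraction(mu_per_cm=0.8, thickness_cm=5.0))    # ~0.02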
Some (though not all) observations of GRBs are associated with supernovae (the plural of the singular "supernova"). All by themselves, supernovae are powerful enough (in the visible-light part of the spectrum) that for decades astronomers have used a particular species of supernova, designated "Type Ia," as part of the "cosmic distance ladder." The circumstances under which they form make them a "standard candle"[ii] that can briefly outshine the galaxy of which they are a part. Conventional supernovae release these incredible amounts of energy in a blast that is a more-or-less spherical wave front of high-speed particles, visible light, and hard radiation, fading over a period of days or even weeks. What makes GRBs different is that they release at least as much energy as supernovae, but do so in seconds, and in two focused directional "beams" thought to be aligned along the magnetic poles of a freshly-minted black hole.
(Any real astrophysicists reading this, please be kind in the hate mail I'm sure you will want to send after what I say next.) As a strictly visual metaphor (the physics behind it, other than the magnetic-field connection, is quite distinct), think of the auroras visible from Earth's extreme northern and southern latitudes; the point is that "stuff" can interact with magnetic fields. A similar cosmic phenomenon is the "pulsar," thought to be (on very solid observational grounds, mind you) a rapidly rotating neutron star (i.e. a failed black hole) whose magnetic poles are offset from its rotational poles, creating a "lighthouse" effect, in both the radio and the visible-light spectrum, that is detectable for many light-years. In fact, their signals are so precisely timed that when first detected by radio telescopes in the 1960s they were referred to, somewhat tongue-in-cheek, as LGMs, for "little green men," as there was no known natural phenomenon that regular.[iii]
There are several, and not necessarily mutually exclusive, posited "progenitors" for GRBs,[1] but there is much that is still unknown, leaving a lot of room for science-fictional speculation. Without going into too much detail (a temptation I frequently face), there are short and long GRBs (remember, time is relative). The so-called "long" GRBs (those lasting more than 2 seconds) have always been observed in connection with supernovae, specifically a kind of supernova called a collapsar. Collapsars are supernovae whose progenitors are stars of 40+ solar masses, massive enough to have developed a solid iron core[iv] that collapses straight (well, almost) into a rapidly rotating black hole without pausing at the neutron-star phase. Short GRBs (those lasting less than 2 seconds, and commonly only fractions of a second, which makes them hard to detect, as one has to be looking at the right spot as it happens) are thought to result from the merger of a black hole with a neutron star, or of two neutron stars.
If I were writing a script, I would use the black hole + neutron star plot device (you could still have the "red matter") as some sort of "ultimate weapon" gone cosmically awry. You could even use the old science-fiction trope of the "Frankenstein complex": either the hubris of the scientists involved, thinking they can accurately predict and control where the beams will end up pointing, or the militaristic leaders who ignore the warnings of their scientists. The creators could also be some unknown threat from elsewhere (can you say "sequel"?). In the altered timeline of the reboot, presumably, the Borg (unarguably the baddest adversaries the Federation has ever faced) are still out there, and the events of the Star Trek: Enterprise episode "Regeneration" still happened. I'm seeing possibilities, J.J.! Just think about the disaster porn FX you could get out of the massive jets from the GRB pulverizing star systems as it bores a path of interstellar destruction through the galaxy!
I am too young (at 47, I seldom get to say that much anymore) to remember the original Star Trek (ST:TOS) during its first run on NBC. Somewhat against type, I was drawn to science fiction by my interest in, and love of, science (and especially astronomy), while it seems that many scientists of my generation were first turned on to science by a love of science fiction. A local TV station started showing reruns of ST:TOS in my first year of junior high, and perhaps not coincidentally, the same year Star Wars: A New Hope was in theaters. Add to this the presence in my school library of the James Blish short-story adaptations of all 79 episodes of ST:TOS and the Alan Dean Foster novella-length adaptations of the animated series (ST:TAS) and you have a perfect trifecta. I was such a voracious reader (I still am) that I had read all of Blish's adaptations long before I saw all the ST:TOS episodes.
As an adult science-geek, one of the things I enjoyed about Star Trek: The Next Generation (ST:TNG) was that the writers included bits of "sciencey" stuff one could read about in recent issues of Scientific American, like "cosmic strings" or "dark matter": cutting-edge science. What astronomy undergrad would turn down a chance to review the sciencey bits of a script? (If done right, keeping the whole of the script a secret would not be that hard.)
Heck, I'm available, and rather cheap, too!
[1] G. Vedrenne, Gamma-Ray Bursts: The Brightest Explosions in the Universe, Chapter 8 (Springer, in association with Praxis: Berlin, New York, Chichester, UK, 2009).
[i] On the Kelvin, or absolute, temperature scale, water freezes at 273.15, boils at 373.15, and all molecular/atomic motion ceases (that is one definition of "temperature") at 0 (zero) kelvin (note the absence of the "°" degree sign). For a star the size and composition of our Sun, the core temperature (partly a function of its mass) is about 15 million K.
[ii] If a light source lies an unknown distance away, but you know that the source is a 100-watt light bulb, and you have a light meter (the kind used in old cameras), all you have to do is measure the power of the light that reaches your meter, do a little math, and voilà, you know how far away the light bulb is.
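For the curious, the "little math" is just the inverse-square law. A minimal Python sketch (the 100 W bulb and the measured flux value are, of course, made up for the example):

    import math

    # Inverse-square law: flux = luminosity / (4 * pi * distance**2),
    # so distance = sqrt(luminosity / (4 * pi * flux)).
    def distance_m(luminosity_w: float, flux_w_per_m2: float) -> float:
        return math.sqrt(luminosity_w / (4 * math.pi * flux_w_per_m2))

    # A 100 W bulb whose light arrives at 0.0001 W/m^2 is ~282 m away.
    print(distance_m(100.0, 1e-4))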
[iii] Some pulsar signals are so stable (remember the accuracy claims of the first quartz watches?) that they are suitable for use as cosmic “radio beacons” for interstellar navigation. The plaques carried by the Pioneer 10 and 11 probes as “greeting cards” to any advanced extraterrestrials that might come across them millions of years in the future illustrated the position of Earth with respect to 14 pulsars using a binary-type code.
[iv] The creation, via fusion, of elements higher than iron on the periodic table requires an input of energy, rather than leaving energy left over to power the star and push back against gravity, which makes iron, as Isaac Asimov titled an essay on supernovae, the "dead-end middle."
Tuesday, November 30, 2010
Right now, the
Even my co-workers are not immune to this sense of bewilderment. The particular branch of my state government for whom I work is in the midst of transitioning, after nearly a decade using Windows XP® and MS Office XP/2003, to Windows 7/Office 2010. In speaking with my managers and co-workers, I have likened the experience of my co-workers starting up their computers one morning, only to be faced with a completely unfamiliar operating system, to what a blind person would likely experience if they awoke one morning to find their furniture had been rearranged while they slept.
One of my younger co-workers has observed, as have I, that many job-seekers who claim to know nothing about computers act as though their ignorance is something to be proud of. I am quite willing to grant a pass to those of my parents' generation (born 1939), those at or near retirement age. However, for those young enough not to have started elementary school when Sputnik 1 launched in 1957 (i.e. those born after, say, 1952/53), there is, barring some notable exceptions I will address shortly, no excuse.
My day-to-day supervisor thinks that, with the rate computer technology is advancing, it will not be long before computers are completely voice-activated, without the need for keyboards or pointing devices (mice, trackballs, etc.), with the possible exception of a finger. The sort of computer-human interaction depicted in Steven Spielberg's film Minority Report may not be far off the mark in this regard. History has shown, however, that while the pace of technological change is in some regards even quicker than some “futurists” predicted, reliably predicting the direction of the changes is more of an art than a science. Case in point: the real 21st century does not look much like the 21st century most people imagined from the vantage point of the 1950s through the mid-1970s.
While the changes that have occurred have not been in the anticipated directions, they are every bit as momentous, if not more so, than the long-delayed flying cars in everyone’s garage, orbiting space colonies, and manned Mars bases. Just because we do not yet have flying cars for all, and are still waiting on the lunar vacation resorts, does not mean that the computer revolution of the last 15-20 years was a mere passing fad, as some plainly thought it was. Apparently, a significant number of our fellow citizens (I am almost exclusively concerned with Americans in this piece) were not paying attention.
Another thing that leaves me stupefied is when people act outraged or offended when it is pointed out that their lack of computer skills will preclude their opening certain occupational doors, as though it were completely unreasonable to expect them to put forth the effort required to acquire new skills and make themselves marketable to employers. I am not talking about “average” people knowing how to create their own webpage in HTML, or build a relational database from scratch. What I am talking about is “average” folks being able to follow the instructions for filling out an online employment application, or to find information about a potential employer by visiting their website.
It is now time for a few caveats, and to crank down the “arrogant bastard” tone. Many people (I am referring to adults throughout) find computers intimidating. One oft-repeated frustration I have heard is that some find computer screens too busy, with just too many things to visually track. This produces stress and anxiety, which in turn makes it that much harder to focus, becoming a debilitating positive feedback loop. Our technological society has created a stimulation-rich environment that did not exist 50+ years ago.
One result of our stimulation and information-rich society has been a marked increase in the identification of, and diagnoses for, cognitive impairments like ADD/ADHD and autism. One mundane reason for this increase is that the diagnostic tools used to detect such impairments have become much more refined as our understanding of how the brain works has grown. The other reasons are best explicated by way of analogy. Brain circuits involved in spoken language were certainly shaped by our evolution, but written language (and formalized mathematics as well) are technological innovations, not things for which the human brain was equipped by natural selection.
There remain a few isolated hunter-gatherer societies in the world today, living much the same way that all of humanity lived prior to the invention of settled agriculture and the first cities. Even after the invention of farming and cities, several thousand more years elapsed before writing was invented. In such pre-literate societies, whether today or in the depths of human pre-history, the condition known today as dyslexia was irrelevant to the environment in which such people lived. Indeed, to the extent that “dyslexia” was irrelevant, it would be reasonable to say that, in some sense, “dyslexia” did not exist.
A quote, most often attributed to William James, but first made known to this author via Mr. Spock of Star Trek fame, summarizes the situation nicely: “a difference which makes no difference is no difference.” If a difference in brain wiring, known to lead to dyslexia in an environment where mass literacy is the norm, were to occur in the brain of a person belonging to a pre-literate society, that difference would, in fact, “make no difference” in the life of the individual in their native culture. It is only in an environment where mass literacy is the norm that such a difference in brain wiring makes an important difference.
People today find themselves bombarded by external stimuli (mostly visual and auditory) that were not part of the environment in which Homo sapiens evolved. Navigating the modern world calls upon cognitive abilities that were seldom, if ever, called on just a generation or two ago. Is it any surprise to anyone, especially experts in the cognitive sciences, that the totally novel sensory and cognitive environment of the early 21st century is revealing heretofore unrecognized, and previously unneeded, strengths and weaknesses in the cognitive endowments of individuals?
One of the names this sort of “sensory overload” goes by is Sensory Processing Disorder (SPD), but there are other conditions that exhibit similar symptoms. Much of the relevant peer-reviewed literature available to this author has titles such as:
- Sensory processing disorder: Any of a nurse practitioner's business?[i]
- The Concept of Information Overload: A Review of Literature from Organization Science, Accounting, Marketing, MIS, and Related Disciplines[ii]
- "Is this Site Confusing or Interesting?" A Perceived Web site Complexity (PWC) Scale for Assessing Consumer Internet Interactivity[iii]
- Employment and Adults with Asperger Syndrome[iv]
- Information-processing deficits and cognitive dysfunction in panic disorder[v]
- Perspectives on sensory processing disorder: a call for translational research[vi]
- The Impact of Induced Stress Upon Selective Attention in Multiple Object Tracking[vii]
ADD/ADHD provides a useful analog here, in that it was first identified in children and only later was it realized that the condition persists into adulthood. Many adults, myself included, developed coping strategies and were completely unaware that the ADD/ADHD they had as a child was still with them until they were properly tested. It may be that because children are often placed into environments and situations they would rather not be in, it is in these situations that their struggles stick out like the proverbial sore thumb and provide a “handle” on which to begin a scientific inquiry. Adults, on the other hand, have traditionally had far more control over which situations and environments they choose to be in, and after many years of exercising that control, the avoidance often becomes unconscious. With the majority of research in this area focused on children, I fear that many adults are slipping through the cracks. Adults who, for whatever reason, avoid anything too intellectually or cognitively challenging will be at a disadvantage, both in terms of self-sufficiency and in providing good role models for their children.
I suspect that the apparent difficulties many adults have in functioning in our current stimulation- and information-rich environments are an admixture of genetic/physiological[viii] factors, socioeconomic/cultural factors, and personal choices. Groping for remedies without having the vaguest notion of how much of a particular population's variation in demonstrated cognitive capacity is attributable to which causal factors is a recipe for failure. Nor is simply waiting for the problem to solve itself an option: such a “wait it out” attitude essentially admits that some sizable fraction of human beings is somehow incapable of being educated or elevated in their competency and understanding. So why not give up right now?
I do not wish to live in a world where a certain percentage of humanity is relegated to a permanent intellectual underclass, not if it can be helped. Putting my “arrogant bastard” hat on again, I also know that many people are, in fact, ignorant, incurious, and complacent. A frequent frustration in my day job is that I am in a bit of a catch-22 in distinguishing those who have a legitimate learning/cognitive disability from those who are merely lazy and/or complacent. Either way, I risk doing a grave disservice to those I work with daily. I do not suffer fools gladly, and far too often I must bite my tongue and refrain from telling someone who just could not be bothered to keep up with our changing world that, unless they have a note from their doctor, I am not going to carry their lazy ass.
[i] Byrne, Mary W. "Sensory processing disorder: Any of a nurse practitioner's business?" Journal of the American Academy of Nurse Practitioners.
[ii] Eppler, Martin, and Jeanne Mengis. "The Concept of Information Overload: A Review of Literature from Organization Science, Accounting, Marketing, MIS, and Related Disciplines." Information Society 20.5 (2004): 325-44.
[iii] Gupta, Reetika, Sucheta Nadkarni, and Stephen J. Gould. ""Is this Site Confusing or Interesting?" A Perceived Web site Complexity (PWC) Scale for Assessing Consumer Internet Interactivity." Advances in Consumer Research 32.1 (2005): 42-50.
[iv] Hurlbutt, Karen, and Lynne Chalmers. "Employment and Adults with Asperger Syndrome." Focus on Autism & Other Developmental Disabilities 19.4 (2004): 215-22.
[v] Ludewig, Stephan, et al. "Information-processing deficits and cognitive dysfunction in panic disorder." Journal of Psychiatry & Neuroscience 30.1 (2005): 37-43.
[vi] Miller, Lucy J., Darci M. Nielsen, et al. "Perspectives on sensory processing disorder: a call for translational research." Frontiers in Integrative Neuroscience 3 (2009).
[vii] Morelli, Frank, and Pamela A. Burton. "The Impact of Induced Stress Upon Selective Attention in Multiple Object Tracking." Military Psychology 21.1 (2009): 81-97.
[viii] When I say “physiological,” I have in mind such things as the environment within the womb, nutrition, or any other biological effect that is not directly attributable to an individual’s genotype.
Saturday, September 4, 2010
At Beyond Belief 2006, speaking about his work on phantom/paralyzed limbs and the denials that can accompany such phenomena, V.S. Ramachandran related a humorous anecdote about a study that asked people whether they were above or below average in intelligence. Ramachandran pointed out that, like height, the distribution of IQ scores in a population takes on the shape of the iconic “bell” curve (called by mathematicians a “normal” or “Gaussian” distribution). A salient property of such a distribution is that 50% of the population will fall below the average value (the arithmetic mean) for the trait in question, and the other 50% will fall above it. The punch line comes when Ramachandran reveals that 98% of the survey respondents considered themselves to be of above-average intelligence, a statistically impossible result indicating that 48% of humanity is “in denial of their own stupidity.” His point was that even people without brain injury engage in classic Freudian, defensive denials every day. Though the study may have been fictional, it is plain that only 50% of humanity can be of above-average intelligence; the other 50% must fall below that average.
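If you would rather check that punch line than take it on faith, a quick Python simulation makes the arithmetic plain (the mean of 100 and standard deviation of 15 are the usual IQ-test conventions; the 98% figure is the anecdote's):

    import random

    # Draw a large sample from a Gaussian IQ distribution and count the
    # fraction above the mean; for a symmetric distribution it is ~50%.
    random.seed(42)
    sample = [random.gauss(100, 15) for _ in range(1_000_000)]
    above = sum(iq > 100 for iq in sample) / len(sample)
    print(f"actually above the mean: {above:.3f}")         # ~0.500
    print(f"claimed to be above:     0.980")
    print(f"'in denial':             {0.98 - above:.3f}")  # ~0.480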
A consequence of the above dilemma showed up in an interview with Lord Martin Rees, Patricia Smith Churchland, and A.C. Grayling, conducted by Roger Bingham of The Science Network. At about 00:32:00 into the dialog, Dr. Churchland notes that there is (primarily in the United States), coming from both the far left and the extreme right of the cultural/political spectrum, a disturbing undercurrent of anti-intellectualism in general, and of anti-science in particular. She confessed that she does not know how to reach the sort of people who get their news from Rush Limbaugh and/or a certain American news channel that she left unnamed.
The brute fact that half of humanity will always fall above the normalized “average” intelligence (measured by whatever criteria one chooses) and the other half will fall below that “average” poses a profound problem for skeptics, atheists, scientific rationalists, humanists, and anyone hoping to increase the role of evidence-based critical thinking in the discourse of our democratic republic. Proposed solutions usually involve some combination of better schools and/or teachers, more educational television programs, more popularizations by capable scientists or other public intellectuals, and more scientifically accurate portrayals of science in popular entertainment.
No matter how “politically incorrect” the above question may seem, it is well worth answering. Asking the question, or attempting to answer it, is not part of some sinister eugenics program or elitist, racist agenda. In Breaking the Spell, Daniel Dennett advocated that all the tools in the arsenal of modern science be applied to understanding religious faith and practice. Dennett also maintained that religious faith is unique in that it is currently off limits to the kind of inquiry he was proposing, hence his choice of title. While it is certainly true that the “taboo” against looking too closely at religious beliefs, for fear of dulling their sheen, is probably the strongest such taboo, inquiries into other areas of human existence can also set off alarms in some people. It is quite likely that what follows, if taken up by experts in the relevant disciplines (I make no claim to be one), may be an important adjunct to the investigation Dennett proposed. If the question under consideration were “what percentage of human beings could reasonably be expected, based on the heritability of the required traits and assuming an environment that provides the opportunity to excel, to perform at the level of an Olympic athlete?”, it would be entirely uncontroversial. That is precisely the kind of question posed (but not answered) in this series of essays.
As Steven Pinker argued so effectively in The Blank Slate, human beings are not infinitely malleable. Relevant to the topic of these essays is the question of what exactly goes into making someone a skeptical, critical thinker (though not necessarily a scientist). Obviously “intelligence,” or as it is colloquially called, IQ (after Intelligence Quotient), an admittedly slippery term, is part of the picture, regardless of how “intelligence” is defined and/or measured. The openness and intellectual honesty demanded by rational inquiry is essential not only to science, but to history, law, medicine, ethics, and any other field of human intellectual endeavor (not to mention the functioning of a healthy democracy), and is antithetical to any form of authoritarianism. The degree to which someone fits the authoritarian personality type certainly matters too: what is the nature/nurture split for authoritarianism? Likewise, curiosity and inquisitiveness are essential to being an informed, rational citizen in the 21st century. However, there are hundreds of millions of human beings in the
Some of the relevant research in all these areas has been conducted already, with the greatest amount devoted to the heritability of IQ. A limited amount of research has been performed on the heritability of authoritarian attitudes, and very little has apparently been done on the nature/nurture mix for things like critical thinking, tolerance for ambiguity, or curiosity and inquisitiveness. Nearly all attempts to engage the public on behalf of science and reason seem to assume that a majority of those not already so inclined or engaged can indeed be reached. Those convinced that the future of humanity critically depends on applying science and reason to the problems that vex this planet would do well to test the assumptions underlying their outreach in order to better direct their efforts. None of this should suggest in the slightest that, if the number of people who can in principle be reached falls below some minimal threshold, the effort is not worth it. Nonetheless, we need some idea of how successful we can reasonably expect to be, all other things being equal.