The 48 Percent Part 3 - Slipping Into the Future

Right now, the United States economy, and that of the world in general, is in turmoil, arguably the worst economic disaster since 1929. In my day job, working on the public side of the workforce development arena (i.e. Job Service), I see many people who feel strangely out-of-place seeking new employment in the 21st-century world, many of whom were with their former employer for decades.

Even my co-workers are not immune to this sense of bewilderment. The particular branch of my state government for which I work is in the midst of transitioning, after nearly a decade using Windows XP® and MS Office XP/2003, to Windows 7/Office 2010. In speaking with my managers and co-workers, I have likened their experience of starting up their computers one morning, only to be faced with a completely unfamiliar operating system, to what a blind person would likely experience upon awaking one morning to find their furniture had been rearranged while they slept.

One of my younger co-workers observed, as have I, that many job-seekers who claim to know nothing about computers act as though their ignorance is something of which they can be proud. I am quite willing to grant a pass to those of my parents’ generation (born 1939), those who are at or near retirement age. However, for those young enough not to have started elementary school when Sputnik 1 launched in 1957 (i.e. those born after, say, 1952/3), there is, barring some notable exceptions that I will address shortly, no excuse.

My day-to-day supervisor thinks that, at the rate computer technology is advancing, it will not be long before computers are completely voice-activated, without the need for keyboards or pointing devices (mice, trackballs, etc.), with the possible exception of a finger. The sort of computer-human interaction depicted in Steven Spielberg’s film Minority Report may not be far off the mark in this regard. History has shown, however, that while the pace of technological change is in some regards even quicker than some “futurists” predicted, reliably predicting the direction of the changes is more of an art than a science. Case in point: the real 21st century does not look much like the 21st century most people imagined from the vantage point of the 1950s through the mid-1970s.

While the changes that have occurred have not been in the anticipated directions, they are every bit as momentous as, if not more so than, the long-delayed flying cars in everyone’s garage, orbiting space colonies, and manned Mars bases. Just because we do not yet have flying cars for all, and are still waiting on the lunar vacation resorts, does not mean that the computer revolution of the last 15-20 years was a mere passing fad, as some plainly thought it was. Apparently, a significant number of our fellow citizens (I am almost exclusively concerned with Americans in this piece) were not paying attention.

Another thing that leaves me stupefied is when people act outraged or offended when I point out that their lack of computer skills will preclude their opening certain occupational doors, as though it is completely unreasonable to expect them to put forth the effort required to acquire new skills and make themselves marketable to employers. I am not talking about “average” people knowing how to create their own webpage using HTML, or build a relational database from scratch. What I am talking about is “average” folks being able to follow the instructions for filling out an online employment application or finding information about a potential employer by visiting their website.

It is now time for a few caveats and to crank down the “arrogant bastard” tone. Many people (I am referring to adults throughout) find computers intimidating. One oft-repeated frustration I have heard is that some find computer screens too busy, with just too many things to track visually. This produces stress and anxiety, which in turn makes it that much harder to focus, creating a debilitating positive feedback loop. Our technological society has created a stimulation-rich environment that did not exist 50+ years ago.

One result of our stimulation- and information-rich society has been a marked increase in the identification of, and diagnoses for, cognitive impairments like ADD/ADHD and autism. One mundane reason for this increase is that the diagnostic tools used to detect such impairments have become much more refined as our understanding of how the brain works has grown. The other reasons are best explicated by way of analogy. Brain circuits involved in spoken language were certainly shaped by our evolution, but written language, like formalized mathematics, is a technological innovation, not something for which the human brain was equipped by natural selection.

There remain a few isolated hunter-gatherer societies in the world today, living much the same way that all of humanity lived prior to the invention of settled agriculture and the first cities. Even after the invention of farming and cities, several thousand more years elapsed before writing was invented. In such pre-literate societies, whether today or in the depths of human pre-history, the condition known today as dyslexia was irrelevant to the environment in which such people lived. Additionally, to the extent that “dyslexia” was irrelevant, it would be reasonable to say that, in some sense, “dyslexia” did not exist.

A quote, most often attributed to William James, but first made known to this author via Mr. Spock of Star Trek fame, summarizes the situation nicely: “a difference which makes no difference is no difference.” If a difference in brain wiring, known to lead to dyslexia in an environment in which mass literacy is the norm, were to occur in the brain of a person belonging to a pre-literate society, that difference in brain wiring would, in fact, “make no difference” in the life of the individual in their native culture. It is only in an environment where mass literacy is the norm that such a difference in brain wiring makes an important difference.

People today find themselves bombarded by external stimuli (mostly visual and auditory) that were not part of the environment in which Homo sapiens evolved. Navigating the modern world calls upon cognitive abilities that were seldom, if ever, called on just a generation or two ago. Is it any surprise to anyone, especially experts in the cognitive sciences, that the totally novel sensory and cognitive environment of the early 21st century is revealing heretofore unrecognized, and previously unneeded, strengths and weaknesses in the cognitive endowments of individuals?

One of the names this sort of “sensory overload” goes by is Sensory Processing Disorder (SPD), but there are other conditions that exhibit similar symptoms. Much of the relevant peer-reviewed literature available to this author bears titles such as:

  • Sensory processing disorder: Any of a nurse practitioner's business?[i]
  • The Concept of Information Overload: A Review of Literature from Organization Science, Accounting, Marketing, MIS, and Related Disciplines[ii]
  • "Is this Site Confusing or Interesting?" A Perceived Web site Complexity (PWC) Scale for Assessing Consumer Internet Interactivity[iii]
  • Employment and Adults with Asperger Syndrome[iv]
  • Information-processing deficits and cognitive dysfunction in panic disorder[v]
  • Perspectives on sensory processing disorder: a call for translational research[vi]
  • The Impact of Induced Stress Upon Selective Attention in Multiple Object Tracking[vii]

ADD/ADHD provides a useful analog here in that it was first identified in children, and only later was it realized that the condition persists into adulthood. Many adults, myself included, developed coping strategies and were completely unaware that the ADD/ADHD they had as a child was still with them until they were properly tested. It may be that because children are often placed into environments and situations they would rather not be in, their struggles stick out like the proverbial sore thumb in those situations and provide a “handle” on which to begin a scientific enquiry. Adults, on the other hand, have traditionally had far more control over what situations and environments they choose to be in, and after many years of exercising that control, the avoidance often becomes unconscious. With the majority of research in this area focused on children, I fear that many adults are slipping through the cracks. Adults who, for whatever reason, avoid anything that is too intellectually or cognitively challenging will be at a disadvantage, both in terms of self-sufficiency and in providing good role models for their children.

I suspect that the apparent difficulties many adults have in functioning in our current stimulation- and information-rich environments are an admixture of genetic/physiological[viii] factors, socioeconomic/cultural factors, and personal choices. Groping for remedies to this situation, without having the vaguest notion of how much of a particular population’s variation in demonstrated cognitive capacity is attributable to which causal factors, is a recipe for failure. Nor is simply waiting for the problem to age out of the workforce any better: such a “wait it out” attitude essentially admits that some sizable fraction of human beings is somehow incapable of being educated or elevated in their competency and understanding. So why not give up right now?

I do not wish to live in a world where a certain percentage of humanity is relegated to a permanent intellectual underclass, not if it can be helped. Putting my “arrogant bastard” hat on again, I also know that many people are, in fact, ignorant, incurious, and complacent. A frequent frustration in my day job is that I am in a bit of a catch-22 in distinguishing those who have a legitimate learning/cognitive disability from those who are merely lazy and/or complacent. Either way, I risk doing a grave disservice to those I work with daily. I do not tolerate fools gladly, and far too often, I must bite my tongue and refrain from telling someone who just could not be bothered to keep up with our changing world that, unless they have a note from their doctor, I am not going to carry their lazy ass.



[i] Byrne, Mary W. "Sensory processing disorder: Any of a nurse practitioner's business?" Journal of the American Academy of Nurse Practitioners 21.6 (2009): 314-21.

[ii] Eppler, Martin, and Jeanne Mengis. "The Concept of Information Overload: A Review of Literature from Organization Science, Accounting, Marketing, MIS, and Related Disciplines." Information Society 20.5 (2004): 325-44.

[iii] Gupta, Reetika, Sucheta Nadkarni, and Stephen J. Gould. "'Is this Site Confusing or Interesting?' A Perceived Web site Complexity (PWC) Scale for Assessing Consumer Internet Interactivity." Advances in Consumer Research 32.1 (2005): 42-50.

[iv] Hurlbutt, Karen, and Lynne Chalmers. "Employment and Adults with Asperger Syndrome." Focus on Autism & Other Developmental Disabilities 19.4 (2004): 215-22.

[v] Ludewig, Stephan, et al. "Information-processing deficits and cognitive dysfunction in panic disorder." Journal of Psychiatry & Neuroscience 30.1 (2005): 37-43.

[vi] Miller, Lucy J., Darci M. Nielsen, et al. "Perspectives on sensory processing disorder: a call for translational research." Frontiers in Integrative Neuroscience 3 (2009).

[vii] Morelli, Frank, and Pamela A. Burton. "The Impact of Induced Stress Upon Selective Attention in Multiple Object Tracking." Military Psychology 21.1 (2009): 81-97.

[viii] When I say “physiological,” I have in mind such things as the environment within the womb, nutrition, or any other biological effect that is not directly attributable to an individual’s genotype.
