Being an Undiagnosed Dyslexic in 1960s American Public Education Is Not for the Meek.

This is an excerpt from a talk I gave, for a learning disabilities podcast, on my experience with dyslexia in the American public education system in the 1960s.

When I recall the most memorable events of my childhood education, one in particular comes immediately to mind. It was the late 1960s in small-town Maine at our local public school, and I was standing at the chalkboard (we had chalkboards back then, with real chalk). I became acutely aware of the laughter behind me. I stood there at the teacher’s demand, trying to figure out how to correctly punctuate and spell a short sentence, though today I cannot recall which one. I remember only the laughter and the look on my teacher’s face.

As if a reprieve, the bell rang, and I could hear the rest of the 5th-grade class shuffling out to the waiting buses. I had been standing at that same board for nearly an hour, in fact, since the beginning of English class. I had been called to the board (despite being one of the few students who had not raised his hand) to correct the spelling and punctuation of a sentence that the teacher had written there. I could not amend the words or the punctuation because, at that time, I could barely read. I was not sure of the spelling of any word, save my name (my mother had taught me to spell it when I was five, and I had practiced it over and over again out loud, much to her dismay I am sure). Apart from my name, I just could not spell anything. I would often reverse letters and numbers; my math skills were far below my grade level. Had I not possessed an incredible memory, I would not have gotten as far as I had.

My friend Roy, whose last name began with the letter G, was always seated next to me in the small classes of the time. Roy was brilliant, or at least more intelligent than I was. We walked partway home from school together, as he lived one street over from me. As I mentioned, it was a tiny town. Roy knew of my limits and allowed me to peek at his work, enough to get by with a “D-” or perhaps, with some concentration, a “D.”

On that rainy afternoon, standing at the chalkboard, Roy could not help me. No one could. Instead, they laughed. The teacher, whose name is not essential to this story, quietly sat at her desk and glared at me. After the rest of the class had exited the room, she explained to me how disgusted she was with me. She informed me that I was a stupid child: stubborn and willful.

As she explained it, she had hoped the embarrassment I felt at the chalkboard would snap me out of it. It was for my own good. It always seemed to be for my own good. Why was it, then, that it only created in me a hatred of education and educators? How could that torment and terror be for my own good? It made no sense to me, but as had been said more than once, I was slow, and perhaps, as my 3rd-grade teacher had told my mother, retarded.

My experience, at least this particular one, took place many years before the term learning disability had entered the vernacular, at least in rural Maine. Today, I very much doubt that any teacher would behave in that manner; at the time, however, she was hardly the first or only person to explain to me in plain and simple terms that I was not a normal person. By junior high, it seemed apparent to my teachers that I was not deliberately trying to fail. Yet fail I did. I failed spelling, English, math; you name it, I flunked it. I remember in seventh grade receiving four “F’s” on my report card my first term.

My father explained that I was expected to bring my grades up, and I was grounded for the ranking period. To be honest, this paled in comparison to the usual abuse we received at my father’s hands, yet it would be impossible for me to accurately describe the way I felt being blamed for my poor performance. I tried to explain to anyone I thought might listen that I had tried as hard as I could but just could not understand what they asked of me. There was more than one discussion of holding me back to repeat the grade. And then again the next year. Nothing came of it apart from causing my mother anxiety. I struggled and cheated enough to bring my grades up to a D-. My math teacher felt sorry for me, which was evident in her grading for the next two semesters. She gave me every break she could. My spelling had still not improved, and how I managed to spell any word correctly was a complete enigma.

I continued to squeak by until tenth grade. After attempting to comprehend algebra while having yet to understand basic math, being ridiculed by my math teacher, and having a book thrown in my face, I walked out of class, out of the school, and never went back. I very much doubt that anyone in the administration was sorry to see me go, as I had begun by the middle of the ninth grade to be as big a nuisance as possible. I smoked cigarettes (often in school), I drank beer in the parking lot daily, I flooded bathrooms by stuffing toilets with paper towels, and I pulled fire alarms.

I had found a group of students, most of them 2-3 years older, who were both willing and able accomplices in mayhem. I attended school only to meet up with friends. By now, my parents had pretty much given up on me, as had my teachers. I knew education was not for me, and at age 16 I was free to get a job and enter the workforce. Of course, without a high school education, the options were narrow, and work as a laborer for a construction company was about all that I could find. A year or so later, I managed, barely, to complete a General Equivalency Diploma, or GED.

The GED was supposed to be equal to a high school diploma, but I can tell you without hesitation that, at least back in the mid-1970s, it was not even close. And it seems this was common knowledge, because even with my GED the job market did not change. I managed to become a dedicated construction worker. Still, I changed jobs every 2-3 years out of boredom. While I managed to stay employed through most of the 1970s and 1980s, I did not enjoy the work, and I always hoped that somehow I could do more with my life.

Just three months before my twenty-eighth birthday, a scaffolding I had been working on collapsed, and a fall of some 20 feet landed me in the hospital. I had fractured my neck in two places, cracked several ribs, and lost much of the sensation in my left side. After two surgeries to replace vertebrae and a plate and screws to hold it all in place, my doctor told me that it would be unrealistic to consider going back to work in the construction field.

I was a slow-witted 30-year-old with no job and no future. Because the injury had occurred at work, my company insurance policy had paid for the surgeries and my salary for the past two years. I was told I qualified for vocational rehabilitation, and I went to visit a professional rehabilitation counselor named Dianne. She was very friendly and seemed genuinely concerned for me. This compassion and concern were largely alien to me, and the effort she spent on my behalf will never be forgotten. She mentioned that I might make a good counselor, and I repeated this to the vocational psychologist she had arranged for me to visit. However, Dr. S. explained to me, somewhat matter-of-factly, that someone with my education and background was being unrealistic in wishing to attend a college of any kind.

I recounted the discussion I had had with Dianne about becoming a counselor. She had thought that, with help, I might be able to earn an associate’s degree in some field of social services or counseling. He explained that the desire to go to college must match ability, and that I simply lacked that ability. He was the expert. I left the final meeting with Dr. S. angry, mostly at myself, at how unfair it was that I had such big thoughts but no intelligence. True, I had a steady income from disability, as Dr. S. had pointed out, but at just 30 years of age, my future looked bleak, at least in my opinion. But unlike in my childhood, I had someone who believed in me.

I had met and married Cathy just a year before my injury, and when I returned home from my last meeting with the vocational psychologist and explained what he had advised, she suggested I ignore the advice of the experts.

I began by applying to an associate degree program in human services at my local college. It was a struggle to be sure, but with a nearly eidetic memory and the aid of tutors, I finished the first semester with an average GPA. During spring break, while I studied for the upcoming term in the school library, I had the chance to chat with some students who were in the Special Education Program. It was then that I learned about learning disabilities, and one or two of the students suggested that I look into it. I arranged for testing through the school, and within a month or so I heard the word for the first time: dyslexia. They taught me a few tricks, such as speed reading (where you read only every other word), and computer programs, such as dictation software, that assist with writing. Math remained a problem until, in my forties, I stopped looking at it as “math” and started seeing it as a symbolic language. Then it made perfect sense.

After my first year in the associate degree program, I transferred to the four-year bachelor’s program at the University of Maine at Farmington, graduating in just two years with a BA in Psychology. I went on to graduate school, earning an MSHS in Community Health Psychology, an M.Div. with a concentration in history and ancient language, and a graduate degree in Family Therapy.

After working in the fields of community health psychology and behavioral psychology, I returned to graduate school, this time pursuing public health. I earned an MPH before entering a Ph.D. program in public health, where I concentrated in epidemiology (graduating at the top of my class). Since completing my Ph.D., I have completed two postgraduate programs, in Clinical Research Administration and Applied Risk Management, and I am currently enrolled in an MHA program, where I am at the top of my class.

In the time since Dr. S. made his pronouncement, I have graduated at or near the top of my class nine times, and I have written and published five books on topics as diverse as evolutionary biology, religious history, and natural science. I have authored and coauthored dozens of journal articles on subjects ranging from healthcare to neurology, psychology, and learning disabilities, and I have spoken at national and international conferences and symposiums on a host of topics.

I guess the moral of my story, if there indeed is one, must be that when a person allows others, even the experts, to define them, they are already handicapped. From the age of five until the age of thirty, I knew I was different. What that difference was, I had to find out for myself. I can only imagine what my life would be like today had I followed the experts’ advice.

Thank you.

 

The Necessity of Moral Courage in Healthcare

Still image from police body-worn camera video of Nurse Alex Wubbels during an incident at University of Utah Hospital in Salt Lake City

This essay is a reflection on the incident at the University of Utah Hospital in Salt Lake City, where a police detective showed up at the ER demanding a blood sample from a victim of a police-chase car accident, without the patient’s consent, a warrant, or even probable cause. The nurse was arrested for refusing to violate the law, the Constitution, and HIPAA.

As leaders in healthcare, we are forced to contend with complex ethical dilemmas. Clinicians, nurses, technicians, and managers practicing in the modern healthcare environment are met with increasingly complex problems. In situations where our desire to act rightly may be obstructed by the inconsistent values and beliefs of our coworkers, patients, and the community, we must fall back on our moral courage in demanding that right action be taken.

According to Murray (2007), understanding the importance of moral courage helps healthcare leaders demonstrate it when they face ethical challenges, and aids in the creation of an ethical environment. Moreover, a good understanding of our moral courage helps us face the inevitable ethical questions and guides us when doing the right thing is not easy or popular. To do so requires moral bravery and a firm conviction to do what is ethical.

Samuel Taylor Coleridge (1772–1834) wrote that moral courage is that which enables us to remain steadfast and resolute despite disapproval. It is, above all other things, insisting that what is right is done, in spite of the risk of ridicule or loss of position. When moral courage fails, the failure often ends up splashed across television news stories and the front pages of newspapers. It should not come as a surprise that such visible examples of moral cowardice have helped fuel our cynical and divided society, even while the basic definition of moral courage has remained unchanged, a testament to both its necessity and its rarity.

Beauchamp and Childress (2001) point out that ethics in healthcare has enjoyed a significant level of continuity since the time of Hippocrates, attesting to the moral necessity of treating the sick and injured and preserving the fundamental principles of autonomy, beneficence, and justice. A recent example of moral courage can be seen in the events surrounding the head nurse at the University of Utah Hospital’s burn unit.

According to a September 1, 2017 Washington Post story and accompanying video, when a Salt Lake City police detective demanded to be allowed into the room of a badly injured and unconscious patient in order to take a blood sample, the nurse informed him that unless he had the consent of the patient, a warrant for the blood sample, or the patient was under arrest (none of which applied), she could not allow him to collect it. This is not only hospital policy but constitutional law. The detective grew angry and threatened the nurse with arrest. She still refused, citing her responsibility to protect the patient. She was arrested, placed in handcuffs, shoved into a police car, and accused of interfering with an investigation.

Despite being assaulted by police (as is evident in the video), this nurse showed remarkable moral courage in protecting her patient against threats and coercion by outside agents. What is needed, one could say required, of healthcare administrators is a strong sense of right and wrong, one that is reflected in action as well as word. We should not be satisfied simply to give lip service to the principles of moral courage; we should practice them in our dealings with our peers, our patients, our staff, and perhaps more importantly, ourselves.

Moral courage is not easy, nor should it be; easy things do not require courage. Moral courage, then, is more than a quality; it is a practice.

The Best of Times, the Worst of Times


During the late 1910s and early 1920s, the War to End All Wars (a misnomer if there ever was one) had ended, and the Spanish Flu (which most likely originated in Kansas) had run its devastating course. Perhaps no period in American history saw such abrupt changes to society as the early 20th century.

Among the more notable events was President Woodrow Wilson’s “Fourteen Points” speech, his plan to end war forever. The Fourteen Points were enthusiastically adopted by diplomats worldwide and became the framework for the League of Nations. At its height, the League of Nations had 58 member states. It is notable that the United States never joined; Americans had suffered casualties in the war, and many citizens wanted to keep America out of European affairs.

On September 16, 1920, suspected Italian anarchists detonated an improvised explosive hidden in a horse-drawn cart on the busiest corner of Wall Street; nearly 40 bystanders were killed, and over 100 were injured. This was the worst terrorist attack in American history until the Oklahoma City bombing in 1995. The attack had repercussions that included a campaign to capture and deport suspected foreign radicals. Over the next few years, thousands of accused communists and anarchists across the country were arrested in raids. A young lawyer who helped direct the raids, J. Edgar Hoover, would go on to become head of the Federal Bureau of Investigation.

1920 also marked the start of the influence of the “Lost Generation,” American writers living in Europe following World War I. Books published during this period include Main Street, a critical examination of small-town America by Sinclair Lewis; This Side of Paradise, the debut novel of F. Scott Fitzgerald; and Flappers and Philosophers, Fitzgerald’s first collection of fiction. Fitzgerald also introduced editors to the work of Ernest Hemingway, who would go on to have some success as well. These significant changes in thought were not limited to printed media.

In November of 1920, the first commercially licensed radio station began broadcasting live results of the presidential election. The real-time transmission of news was unprecedented. The world was captivated by the idea of instant news, and radios, “the talking box,” became very popular; in 1922 alone, Americans bought over 100,000 radios. The next year, they purchased over half a million. By the mid-1920s, the number of commercial radio stations had grown to over 700, covering virtually every town in America. The Waltons, a popular dramatic series that aired through the 1970s and early 1980s, often showed the family gathered around the “wireless,” as radios were sometimes called, listening to popular shows or the news.

While Americans reveled in the end of the war, delighting in the pages of Fitzgerald, Hemingway, and Langston Hughes, a more ominous player was entering the American consciousness. Although not a writer, it would nevertheless make an enduring mark on history.

The disease started with a high fever and severe headache, often leading to double vision and slowed physical responses. Often it progressed to a general lethargy and a need for constant sleep. In acute cases, this was followed by coma and death. The condition caused swelling in the hypothalamus, the part of the brain that controls sleep, and came to be known as Encephalitis Lethargica, for the lethargy its victims experienced.

Today, a century has passed, and it is still not understood where the condition came from or what caused it, although many epidemiologists and virologists agree that it was most probably viral. The first known case involved an unknown soldier from the Battle of Verdun in 1916. The man was originally hospitalized in Austria and later sent to Paris for examination. The doctors examining him were puzzled. He slept constantly, and even when awake did not seem fully conscious. Soon another 60 soldiers joined him. Despite examination and attempts at treatment, over half died of respiratory failure. Postmortems revealed the swelling in the hypothalamus.

Almost as quickly as it struck, Encephalitis Lethargica faded from history, even though some victims lived on, often in perpetual sleep. It was not until the 1960s, when the neurologist and author Dr. Oliver Sacks discovered a group of patients living in a hospital in the Bronx, that the condition would once again become a topic of discussion. Working with the patients in the hospital, Dr. Sacks found that most would respond to some form of stimulus. For example, several responded to hearing music, while others would catch a ball if it were tossed to them. They did not, however, throw the ball back or initiate any actions themselves. Dr. Sacks recounted the story of a patient from another part of the hospital who brought in a poodle. When the dog jumped up on a woman who had always been rigid and unmoving, she suddenly started talking about how she loved animals, and laughed as she stroked the dog. Once the dog was removed, she returned to her rigid, frozen state.

Dr. Sacks originally believed the patients were suffering from some form of Parkinson’s disease, a chronic and progressive movement disorder affecting nearly one million people in the United States. The exact cause of Parkinson’s disease is unknown, and there is presently no cure. However, there are treatment options, including medication and surgery, that help manage the symptoms: tremors in the hands, arms, and face; slowness of movement; rigidity of the limbs; and impaired balance and coordination. What is known about Parkinson’s disease is that it involves the malfunction and death of neurons in the area of the brain known as the substantia nigra. One of the functions of neurons in this area is the production and release of dopamine, a chemical messenger used to communicate with another part of the brain responsible for movement and coordination. As the disease progresses, dopamine production decreases, leaving the person unable to control his or her movements.

Dr. Sacks began treating the Encephalitis Lethargica patients with the then-experimental drug levodopa, or L-dopa, a precursor to dopamine and the other neurotransmitters known collectively as catecholamines. L-dopa increases the concentration of dopamine and was found in the late 1960s to be an effective treatment for some of the symptoms of Parkinson’s disease and other dopamine-responsive conditions. Using L-dopa, Dr. Sacks was able to temporarily revive some of his patients. Most became ambulatory and talkative; most suffered Parkinson’s-like side effects, but this was preferable to the frozen state they had been trapped in for decades. Some asked to be taken off the medication because they preferred a trance existence to waking up decades later. But even those who wished to maintain the treatment developed a tolerance to it and returned to their frozen condition.

Dr. Sacks’s work with the Encephalitis Lethargica patients is chronicled in the film Awakenings, starring Robin Williams.

The cause of Encephalitis Lethargica remains largely unknown, despite the disease having caused the deaths of over 5 million people worldwide. Encephalitis Lethargica has not been diagnosed since the end of the epidemic in 1927. This does not, however, mean it has disappeared from human history. Few things do.

For an interesting look into the amazing neurological writings and work of Dr. Oliver Sacks, see his books The Man Who Mistook His Wife for a Hat, Awakenings, and Seeing Voices.

 

 

The (Not So) Spanish Flu, and How It Became the Deadliest Epidemic in Modern Times.


I had a little bird,

its name was Enza.

I opened the window,

and in flew Enza.

     ~Children’s rhyme of 1917

In early March 1918, a mess cook at an Army base in Kansas reported to the infirmary complaining of sore throat, headache, and fever. The doctor checked him over, could find no cause for alarm, and returned him to duty. By lunchtime the infirmary was filled with soldiers complaining of similar symptoms, and by the end of the month the number of sick soldiers had grown beyond the capacity of the base hospital, and a makeshift infirmary was created in an airplane hangar. By the end of that first month, 38 men had died. Influenza routinely killed 30% of those infected under age 2; this influenza was killing healthy young men in their twenties. The soldiers were suffering from what would become known as the Spanish Flu. You can imagine that the worst place to have an infectious disease would be one where tens of thousands of people were crowded together, as they were in the training camps where soldiers prepared to go to war. The disease would spread to other training camps and eventually overseas.

Laura Spinney, in her book Pale Rider: The Spanish Flu of 1918 and How It Changed the World, called the 1918 influenza epidemic “the greatest wave of death since the Black Death.” A bit dramatic perhaps, but nevertheless accurate. What made this epidemic even more deadly, brought groups of infected and uninfected people together, and greatly helped the spread of the disease was the denial by those in power that anything was wrong.

In 1917, California Senator Hiram Johnson, an isolationist Progressive-Party-member-turned-Republican, stated that the first casualty when war comes is truth. At the time, Congress passed a measure, signed into law by President Woodrow Wilson, that made it punishable by up to 20 years in prison to “utter, print, write or publish any disloyal, profane, scurrilous, or abusive language about the government of the United States,” a clear violation of the First Amendment. Yet because Spain was neutral, its press was not under such morality laws, and since Spanish newspapers were free to report on the pandemic, it became known as the Spanish flu.

In the cities of Minneapolis, St. Paul, Chicago, Philadelphia, New York, San Francisco, and several others, parades and events considered “important to the war effort” were not cancelled, even though they brought great numbers of people together. When public health experts warned that such events (parades, rallies, etc.) would help spread the disease, they were reminded of the Morality Law and the notion that “fear kills more than the disease.”

The flu pandemic of 1918 to 1919 infected an estimated 500 million people worldwide and claimed the lives of between 50 and 100 million. More than 25 percent of the U.S. population became sick, and 675,000 Americans died during the pandemic, more than 10 times the number of Americans killed in Vietnam. The 1918 flu was first observed in Europe, the U.S., and parts of Asia before swiftly spreading around the world. By the time the disease had burned itself out, the total number of dead outnumbered those killed in both World Wars.

Why so deadly? Although many people had been exposed to the H3N8 virus that had been circulating in the human population for about a decade, the human virus picked up genetic material from a bird flu virus just before 1918, creating a novel virus. The new virus had surface proteins that were very different, so while people’s immune systems would have made antibodies, those antibodies would have been ineffective against the virus. The high fatality was brought about by a combination of factors: refusing to warn the public about exposure, refusing to allow newspapers to print stories of the epidemic, the novel makeup of the virus, the opportunistic bacterial infections that thrived in many weakened immune systems, and a process known as hypercytokinemia, or cytokine storm. Cytokines are molecules that aid cell-to-cell communication in immune responses and trigger inflammation. It was the overreaction of the immune system (fever, inflammation, and the like) that caused the high lethality of this influenza.

While most influenza viruses are dangerous for children, the elderly, or those with compromised immune systems, the 1918 strain was deadliest for those in their 20s and 30s in good health, with robust immune systems. The Spanish influenza strain provoked a manic immune response, creating a potentially fatal immune reaction with highly elevated levels of various cytokines. In recreating the pandemic of 1918, medical research scientists injected the reconstructed 1918 influenza virus into mice and monkeys to try to understand why it was so lethal. The animals’ immune systems responded so violently that their lungs filled with blood and fluids, essentially drowning them. Scientists have deduced that what made the Spanish flu so deadly was that it used the body’s own immune system to flood the lungs with fluid and destroy the lining of the respiratory system, making it much easier for bacteria to infect the lungs. In this case, the healthier you were, the more violent the immune response to the virus.

In the end, the 1918 influenza pandemic was due to a combination of a novel virus, an official disregard for honesty meant to avoid damaging morale and the war effort, and a cowardly press that refused to challenge the veil of secrecy that was the government’s propaganda machine. Let us hope the government and those at the highest levels of power will treat the next influenza epidemic more honestly and openly.

Could it ever happen again? Could a novel influenza virus again become epidemic, then pandemic, killing millions? According to an article in The Lancet, if a flu pandemic like that of 1918-1919 were to break out today, it would likely kill 60-80 million people (more than the total number of people who die in a single year from all other causes combined). The estimate stems from a new tally of flu deaths from 1918 to 1920 in different countries, which varied widely. To gauge the potential threat from the H5N1 avian influenza currently circulating among birds in Asia and Africa, the researchers used the toll of the most severe previous pandemic, that of 1918, as a benchmark.

Worried? You should be. If anything, the new estimates may be optimistic, according to epidemiologist Neil Ferguson of London’s Imperial College in an editorial published in The Lancet. High incomes may not protect rich countries as much as some writers have suggested; in the 1918 pandemic, being young, fit, and healthy was no protection. Public health researchers and epidemiologists warn that it is not a matter of if the next influenza pandemic strikes, but when.

Meanwhile, national governments slash public health funding and funds needed for health research.

 

The Eradication of Smallpox and the Helper T-Cells


In May of 1980, the World Health Organization (WHO) pronounced, after a two-century fight, that smallpox had been defeated. This meant that there were no known cases of the disease anywhere on the planet. Other infectious diseases have returned from the brink of extinction, but smallpox, among the deadliest of them all, remains the only human communicable disease (thus far, anyway) to be eradicated.

Most people are familiar with smallpox, if at all, from their history classes or from films about the conquest of the Americas. But smallpox was not an invention of the Spanish conquistadors; it was something to which they had naturally grown resistant.

Medical anthropologists believe that the disease began to infect humans around the time of the first agricultural settlements of the Old World. Despite such a long history, little evidence exists from before 1570 BCE, when it appeared in the New Kingdom of Egypt. Many historians believe that the Plague of Athens in 430 BCE and the Antonine Plague of 165-180 CE, which killed upwards of 7 million people, including the Emperor Marcus Aurelius, and hastened the fall of the Roman Empire, were caused by smallpox.

Smallpox made its way to France sometime in the early 700s. A clergyman writing around that time described the unmistakable symptoms as “a violent fever followed by the appearance of pustules.” He found that if the person lived long enough, the pustules developed scabs, after which the person survived. By the time the disease had reached the rest of Europe, it had already spread across Africa and Asia. Smallpox was not a disease of the poor, the aged, or the young; it was an equal-opportunity killer. In the Old World, smallpox killed approximately 30% of those who contracted it, while many more were left disfigured or blind. As devastating as smallpox was in the Old World, it was far more destructive in the New World. One significant reason for this great difference lay in the immune systems of the two groups.

Helper T-cells are among the most important cells of our adaptive immune system and play a significant part in almost all of our adaptive immune responses. Helper T-cells activate cytotoxic T-cells, which target and kill invading organisms. They also activate B cells, which secrete antibodies, and macrophages, which ingest and destroy microbes. But a relatively recent discovery is that the American natives possessed a different variant of the helper T-cell than Europeans did. Whereas Europeans maintained an immune response that had developed over thousands of years of fighting off bacteria and viruses, the peoples of the Americas had developed an immune system that dealt with the daily concerns of parasitic infection. While their T-cells were better at recognizing and combating invading parasites, they did not recognize many of the organisms the Europeans had adapted to and brought with them.

The population of the Americas in the pre-Columbian era is estimated to have been between 25 and 60 million people. Of those populations, approximately 95% died as a result of European diseases. At the same time, the Europeans did not share the natives’ susceptibilities and were spared the bulk of the infectious diseases of the Americas, with one or two significant exceptions. The reason the Europeans were immune to so many possible infections in the Americas stems from the fact that Europeans had been caretakers of domesticated animals for several thousand years and had adapted to many common diseases found in the animals they used as food sources; adapted, but certainly not immune.

The American natives did not possess the same domesticated animals; cattle, pigs, and horses were absent from the Americas. While the Spanish and Portuguese explorers met with some resistance from natives, the Incas and the Triple Alliance (Aztecs) had largely succumbed to smallpox by the time the explorers arrived. Historians now calculate that the indigenous populations of both American continents were reduced by about 90% by the introduction of smallpox. The great armies that the conquistadors faced were already severely weakened by disease. This lesson was not lost on the military leaders who followed (Lord Jeffrey Amherst, the commander-in-chief of British forces in North America during the French and Indian War, advocated handing out smallpox-infected blankets to his native foes).

This helps explain how a group of fewer than 200 men, over half of whom were on foot, managed to defeat an empire, at that time the largest in the world, with a reported standing army of over 70,000.

Today, thanks to the efforts of public health practitioners, medical researchers, and physicians, smallpox is, so far as we know, relegated to a bygone era. In fact, if you were born after 1972, you would not have received a smallpox vaccine. Still, what would happen if smallpox were re-introduced into American society today? The Variola major virus that causes smallpox killed a third of those infected and was so virulent that it claimed the lives of over 300 million people in the 20th century alone. Although estimates vary somewhat, the total number of persons killed by smallpox may exceed 2 billion. Could an infectious disease so deadly ever make a comeback? If it were to, there would likely be two possible sources: intentional release or unintentional release.

An intentional release would be the release of the virus into a population by a terrorist or group. An unintentional release could come through the thawing of frozen virus. One Siberian town lost 40% of its population to smallpox in the 1890s. The victims had been buried in the upper layers of permafrost along a river whose banks have begun to erode due to floodwaters from a warming climate. Russian scientists are also concerned that the graves of anthrax-infected cattle can be found across Russia, including in areas where the ground has thawed two feet deeper than normal. Recently, the thawing of one of these anthrax-killed animals claimed over 100 reindeer and hospitalized 13 people living in the area. Scientists speculated that, as high temperatures thawed the remains of an infected carcass frozen for many years, the reindeer ate them; from there, the infection passed to the herders.

Regardless of the cause, the Centers for Disease Control and Prevention (CDC)’s Strategic National Stockpile is the nation’s largest supply of life-saving pharmaceuticals for use in a public health emergency severe enough to cause local supplies to run out. The stockpile ensures the right medicines and supplies are available when and where they are needed to save lives, and this includes, you may be relieved to know, a reported 400 million doses of smallpox vaccine.

On Being a Disease Detective


Recently, I was asked what a disease detective does. While I considered my response, a situation came to mind from a few years back that captured the process perfectly. The following events are true; the names have been changed to protect the innocent, namely me.

Shifu: This can’t be right, it must be a coincidence.
Oogway: There are NO coincidences.
Shifu: Yes, you said that before.
Oogway: That was no coincidence either. (Kung Fu Panda)

My wife developed a nasty cough one October. Listening to her breath sounds through a stethoscope, I could hear a distinct rattle, indicating fluid of some sort. Her chief complaints were feeling tired and having a headache. This was just a week after we had completed an outdoor endurance event that involved a great deal of mud, freezing water, and military-style obstacles. She felt fine the day after the event, but by the middle of the next week had started to experience the symptoms. At my urging, she made an appointment with her nurse practitioner. In retrospect, I should have accompanied her.

The practitioner explained that she had a cold and prescribed Tylenol, fluids, and bed rest. Two weeks later, the symptoms had worsened, and she made another appointment. This time the clinician ordered a cardiac stress test. I am uncertain as to her reasoning; however, I believe in being thorough. My wife mentioned having participated in an event that involved mud and water just a few days before the onset of symptoms but was told it had nothing to do with her illness. When I heard this, I was shocked, and any respect I had for this practitioner, albeit slight, disappeared. Any good disease detective understands, foundationally, that there simply are no coincidences.

Two weeks passed, and the appointment for the cardiac stress test came and went. My wife continued working but looked and felt run down. Her breathing continued to be labored, and she lacked energy. When the practitioner called and explained that the test had revealed no abnormalities, my wife asked the logical question: “What will they do now?” The response (and I would still have difficulty believing this had I not heard it myself) was “What do you mean?”

My wife said, “I still have the symptoms; what are you going to try next?” The nurse said that the practitioner had not made any follow-ups or recommendations. At this point, although I knew it might annoy my wife, I called the office back and demanded to speak to the practitioner. Trying to remain calm, I explained that my wife’s symptoms were intensifying and that I felt fairly strongly that she had a lung infection, most probably Campylobacter, and that she had most probably been infected during the event she had described, now over two months prior. Again, I was told that this was merely a coincidence, and that, frankly, she did not feel comfortable being questioned by the spouse of a patient. At this point, my patience evaporated. I explained to her that I was not just the spouse of a patient; I was a clinical epidemiologist and an environmental health scientist. Further, I said, I believed that allowing a patient to suffer for over two months while proffering no answers was ridiculous, and that for a patient to have to repeatedly ask “What do we do next?” to get some course of treatment bordered on incompetence. Her demeanor softened, and she asked what I would recommend. I suggested that, as the primary symptom was breathing-related, I would want to see a pulmonologist as soon as possible.

I accompanied her to the pulmonologist’s office, where a thorough history was taken. When I mentioned the event in October and explained what we had done for nearly 9 hours, the pulmonologist put her pen down and looked at us. “That is most likely the cause,” she remarked. My wife then repeated what her practitioner had said, that it was just a coincidence. The pulmonologist smiled and said, “There are no coincidences.”

I could not help but smile as I had been saying this for almost three months.

The pulmonologist conducted a few tests (peak flow, gas diffusion), and my wife was given an inhaler, after which her peak flow was tested again. Finally, the pulmonologist stepped out of the room while my wife dressed, and my wife asked me what I thought the pulmonologist would say. I said that I thought perhaps she would diagnose a lung infection, probably Campylobacter, picked up during the event, which had taken place on a farm that once raised swine. I said that she would most probably be given an antibiotic, most likely a Z-Pak, along with a corticosteroid and a rescue inhaler. The pulmonologist returned and told us that she believed my wife had picked up a Campylobacter infection from the mud and water during the event. She was surprised that this had not been diagnosed months ago, but said that it should be relatively easy to treat. She handed my wife three scripts, explaining that the first one was for a medicine called “azithromycin, or Z-Pak.” There were also scripts for a corticosteroid and a rescue inhaler, which could be used as needed.

My wife turned to me and said, “You think you’re so smart, don’t you?” She now sees a board-certified internist whom I vetted. It never ceases to amaze me that people who would not keep going to a mechanic who could not fix their car will continue to see a healthcare provider who cannot seem to properly diagnose their illness, especially when they are given the answer, not just once, but several times.

As Oogway said, “There are no coincidences.” 

Opinions Are Not Facts, and Things Are Seldom as They First Appear.


“The rise of childhood obesity has placed the health of an entire generation at risk.” ~Tom Vilsack

Often of late, we hear non-experts make sweeping pronouncements on subjects from healthcare and education to socioeconomics. Seldom do they add anything to the conversation; instead, they muddy the already perturbed waters. While we certainly cannot fault someone for proffering an opinion, we must remember that it is no more than that: an opinion, an idea.

Of course, some ideas are more lasting than others, and on rare occasions they hold up to rigorous scientific scrutiny. An example is the Theory of Evolution. It should be noted that the term “theory” as used in science means a well-substantiated explanation of the natural world, one based on facts repeatedly confirmed by observation and experiment. This usage seems to confuse some people, and others, being intellectually dishonest, attempt to imply that a scientific theory is simply a guess. Therefore, it seems that some explanation is needed to outline the steps in the scientific method: observation, hypothesis, experiment, analysis of data, and conclusion.

Observation and Hypothesis

A hypothesis is a reasonable guess, based on knowledge or observation, as to how something occurs or, more frequently, how one variable might affect another (if at all). For example, one hypothesis put forward in the 1970s was that, for many people, the regular use of tobacco products resulted in cancer of the lungs. The original question may have been something like, “Why have medical practitioners seen a drastic increase in lung cancers in the past 20 years?”

After formulating a series of research questions based on observation, the researcher creates a hypothesis and a research question (RQ) that attempt to address the original question. They then make a testable prediction, test it, and analyze the resulting data. The hypothesis will need to be tested, retested, and tested again before it is accepted as true.

The research question may have looked like this: Is there a statistically significant association between subjects who smoke tobacco and the development of lung cancer? The RQ was simply a hypothesis (a pretty good one, if we are honest). However, hypotheses are proven and disproven all of the time. The use of hypotheses is critical in the scientific method.
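
To make “statistically significant association” concrete, here is a minimal sketch of how such a question might be tested today. The counts in the table are invented purely for illustration; only the chi-square test itself is a standard statistical tool.

```python
# A minimal sketch of a test of association, using invented counts
# (not real study data) purely to illustrate the idea.
from scipy.stats import chi2_contingency

# Hypothetical 2x2 contingency table:
# rows = smokers / non-smokers; columns = lung cancer / no lung cancer
observed = [
    [90, 910],   # smokers: 90 of 1,000 developed lung cancer
    [10, 990],   # non-smokers: 10 of 1,000 developed lung cancer
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.1f}, degrees of freedom = {dof}, p = {p_value:.2g}")

# A very small p-value would lead us to reject the null hypothesis of
# "no association," which is exactly what the RQ above is asking about.
```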

Theory

A scientific theory consists of hypotheses that, after repeated testing, have been shown to be true, at least thus far. Theories are one of the pinnacles of provability in science because a theory must never have been shown to be wrong in its predictions. Scientific theories can and do evolve; this does not indicate that the original theory was incorrect, just that it was incomplete. An example is Newton’s theory of gravity. While Newton could show what gravity did, he did not fully understand why it did it.

Laws

Scientific laws are short, sweet, and always true. They are often expressed in a single statement and rely on a concise mathematical equation. Laws are accepted as being universally true and are the cornerstones of science. They must never be wrong (that is why there are many theories and few laws). If a law were ever shown to be false, any science built on that law would also be wrong. For example, E = mc² has been shown to be true, at least thus far; however, it is not in itself a law, but a central tenet of the Theory of Special Relativity. It is not inconceivable (although the odds against it are astronomical) that all of space is not the same.

Examples of scientific laws include some of the simplest yet most frequently misunderstood, the laws of thermodynamics, which involve the properties of temperature, energy, and entropy. Boyle’s law describes how the pressure of a gas increases as the volume of its container decreases; in other words, it describes the force exerted on the container by the compressed gas. Both of these laws can be shown to be true mathematically. While scientific laws describe a formula that explains what will occur, they may not always describe why it will occur. A good example is the Law of Gravity.
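
Before turning to gravity, Boyle’s law is easy to see in symbols. The numbers below are invented simply to show how the relation is applied for a fixed amount of gas at constant temperature:

```latex
% Boyle's law: for a fixed amount of gas at constant temperature,
% pressure times volume is constant.
P_1 V_1 = P_2 V_2
% Invented example: halving the volume doubles the pressure.
P_2 = P_1 \frac{V_1}{V_2}
    = 100\,\text{kPa} \times \frac{2\,\text{L}}{1\,\text{L}}
    = 200\,\text{kPa}
```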


According to Sir Isaac Newton’s Law of Universal Gravitation, “Every point mass attracts every single point mass by a force pointing along the line intersecting both points. The force is directly proportional to the product of the two masses and inversely proportional to the square of the distance between the point masses.” What this means in plain language is quite a bit simpler: gravitational force is inversely proportional to the square of the separation distance between any two interacting objects; the greater the separation distance, the weaker the gravitational force between them. As two objects are separated from each other, the force of gravitational attraction between them decreases (the inverse square law). This law accurately predicts what will happen, but not why. Is gravity simply a result of mass warping space-time, as described by Einstein? Is it the result of gravitons (a hypothetical particle that mediates the force of gravitation in the framework of quantum field theory)? You see how a law explains the what, but not the how (the how is in the theory, which of course begins with a question and, you guessed it, a hypothesis).
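
In symbols, the law the paragraph above paraphrases is usually written as follows; the doubling example is just the inverse-square relationship worked out:

```latex
% Newton's law of universal gravitation (G is the gravitational constant):
F = G \frac{m_1 m_2}{r^2}
% Doubling the separation r reduces the force to one quarter:
\frac{F(2r)}{F(r)} = \frac{r^2}{(2r)^2} = \frac{1}{4}
```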

One theory (which many accept as law) is that obesity is caused by overeating. While a poor diet and large quantities of food of little nutritional value may lead to overeating, the fact is that obesity involves a very complex interplay of energy consumption, energy expenditure, hormone production, activity, resting metabolic rate, dietary intake, and several other factors, including the type of fat a person has and how much of it. As health science has come to realize, fat is not just fat, and obesity has complex causes.

The human body has two major types of fat tissue: white fat and brown fat. In the human body, fat is used for energy, for maintaining body temperature, for regulating hormones, and as a store of energy for later use.

White fat is found around the kidneys and under the skin in the buttocks, thighs, and abdomen, and it is used to store energy and to manufacture and store the hormones that control appetite and hunger. Composed of a single lipid droplet, white fat has significantly fewer blood vessels (explaining why it appears white), and it is the main form of fat in the body, created from connective tissue. Because white fat is critical in the creation of estrogen, adiponectin, and leptin (hormones that help regulate hunger), it is a major endocrine organ. White fat also has numerous receptors for glucocorticoids, growth hormones, and important stress hormones, including adrenalin, norepinephrine, and cortisol. Finally, white fat produces inflammatory substances, as the adipocytes of obese individuals tend to attract macrophages (part of the immune response), promoting inflammation and increasing the release of inflammatory factors that influence insulin resistance. In obese persons, large numbers of immune cells infiltrate adipose tissue, promoting a low-grade chronic inflammation that has been associated with hypertension, heart disease, and metabolic syndrome. Because white fat acts as an endocrine organ, it seeks self-preservation: if it begins to disappear through diet or exercise, it can suppress the release of leptin (which tells the hypothalamus to instruct us to stop eating), tricking us into believing we are still hungry.


Brown fat is found on the upper back of healthy human infants and adults. Brown fat (also known as the good fat) releases stored energy as heat when we are cold. However, it also produces some inflammatory chemicals. As you can see, not all fat is the same, and there can be many reasons why people are overweight or obese, including some medical conditions that make it very difficult, if not impossible, to reduce the amount of fat we have or to alter the balance of white fat to brown fat.

Medical conditions that are well known to cause or complicate overweight and obesity include some genetic syndromes and endocrine disorders (remember, white fat is an endocrine organ). The most prevalent of these is Prader-Willi Syndrome, a genetic disorder caused by the loss of function of specific genes. Prader-Willi Syndrome causes weak muscles, poor feeding, and slow development in infants, yet causes young children to be constantly hungry, often leading to obesity and type 2 diabetes. Because the endocrine system produces hormones that are critical in maintaining the balance of energy in the body, endocrine disorders can cause overweight and obesity.

Hypothyroidism. People with this condition have low levels of thyroid hormones, in particular triiodothyronine (T3) and its prohormone, thyroxine (T4). T3 and T4 affect nearly every physiological process in the body, including metabolism, body temperature, and heart rate. These tyrosine-based hormones are produced by the thyroid gland, and underproduction (hypothyroidism) significantly slows metabolism. For people with hypothyroidism, even drastic reductions in caloric intake are insufficient to normalize weight. People with hypothyroidism also have difficulty producing body heat, and as a result have a lower body temperature and are unable to use stored fat efficiently as energy.

Cushing’s Syndrome. People suffering from Cushing’s have high levels of glucocorticoids, including cortisol, in their blood. Abnormally high cortisol levels trick the body into thinking it is under constant stress. As a result, appetite increases and the body stores more fat. The symptoms of Cushing’s can include high blood pressure and abdominal obesity (an “apple” shape) with thin arms and legs, often with a round face. Cushing’s can develop after taking certain medicines, or if the body manufactures too much cortisol.

Type-2 Diabetes. In healthy individuals, insulin released from the pancreas activates glucose uptake in peripheral organs. Insulin release is triggered by the postprandial (after-eating) rise in blood glucose. Insulin promotes increased glycolysis and respiration and enables the storage of glucose and lipids through the stimulation of glycogenesis (glucose is added to glycogen), lipogenesis (acetyl-CoA is converted to fatty acids), and protein synthesis (generating new proteins). Insulin also reduces the degradation and recirculation of carbohydrates and lipids by inhibiting gluconeogenesis (forming glucose from non-sugars) and lipolysis (the breakdown of lipids). The causes associated with Type-2 diabetes are complicated and include a number of preconditions, including genetics, lifestyle, diet, and family history.

Tumors. Some tumors, such as craniopharyngioma (a type of brain tumor derived from pituitary gland embryonic tissue that occurs most commonly in children), can also cause severe obesity, because the tumor grows near, or invades, the part of the brain that helps regulate hunger.

Medicines. There is a host of medications that can cause weight gain and may lead to obesity. Medicines such as antipsychotics, antidepressants, antiepileptics (used to treat epilepsy), and antihyperglycemics (used to lower blood glucose) can all cause weight gain, which in turn can lead to obesity.

Abnormal Microbiome. As previously discussed, obesity has been characterized as an imbalance between energy intake and energy expenditure and involves a complex process of genetic, biological, and environmental factors. One area that has attracted renewed interest is the microbiome, or as it is commonly called, the “gut biome.” The human digestive tract contains over 100 trillion microbial cells. These cells have the essential roles of digesting food, processing and extracting energy from digested material, and regulating metabolism. When this microenvironment is altered, through poor diet, medications, or the introduction of chemicals (pesticides, herbicides, improper foods), the microbial ecosystem fails to function properly. Scientists have shown in animals that the introduction of unhealthy feed, or of residue from pesticides or herbicides, is associated with increased metabolic and immune disorders. In humans, the molecular interactions linking this microbiome with host energy metabolism, fat accumulation, and disruptions to immune response have been identified.

When you see someone who is overweight or obese, before you conclude they overeat and have a poor diet, remember, things are seldom as simple as they first appear.