You say Potato, I say.. Sponge Cake?


Since starting this blog a few years back, I have had some interesting feedback on the diagnostics and research behind the odd ailments and conditions, and yet I’ve gotten more than a few questions that defied an easy answer. Not surprisingly, these were not health or medically related. Here is a short post compiled from two such questions.

Question one: “What is a Clinical Epidemiologist?”

Question two: “Do you work with bugs or skin?”

Let me try to address these questions and clear up any confusion concerning epidemiology, entomology, and dermatology.

Epidemiology

A clinical epidemiologist is a medical professional who studies diseases and the way they spread. Primarily, they use research to improve clinical and patient-oriented healthcare. Some clinical epidemiologists work in labs, or in a forensic capacity conducting investigations of disease outbreaks. Whether they work in a hospital setting, a research laboratory, or in the field, the focus is ultimately on reducing the occurrence of negative health outcomes.

Some of the duties that a clinical epidemiologist would perform include overseeing research on various diseases or outbreaks (like Ebola or influenza), compiling data for publication, developing procedures or policies related to disease control in medical facilities or research laboratories, consulting with healthcare facilities, nursing homes, or hospitals to minimize infectious disease issues, helping develop educational resources to minimize the spread of disease (hand washing, wearing a seatbelt, etc.), consulting with public health departments on infectious and chronic disease issues, designing and developing research studies, and interpreting and analyzing medical data for other researchers. They may also investigate the effects of medications on patients to better understand safety and effectiveness, and work in the field to locate the source of disease outbreaks, whether viral (Ebola), bacterial (Yersinia pestis), vibrio (cholera), or zoonotic (parasites).

A clinical epidemiologist may choose to work in a more specialized field, for example environmental health (pollution or environmental toxins), chronic conditions (cancer, metabolic syndrome, obesity, my specialty), or infectious conditions (viral, bacterial, vibrio, or zoonotic). Beyond these specialties, clinical epidemiologists spend a good deal of time consulting and conferring with other medical professionals, including physicians, public health officials, researchers, and health administrators. Many also work in research facilities or universities, and the Centers for Disease Control and Prevention (CDC), the Federal Emergency Management Agency (FEMA), the Food and Drug Administration (FDA), and other government agencies employ a large number of clinical epidemiologists.

Entomology

An entomologist is a natural scientist who studies insects. Entomologists study the classification, life cycle, distribution, physiology, behavior, ecology, and population dynamics of insects. Many work in agricultural and urban environments. Everything we know about pollinators like bees, we owe to entomologists. They also enforce quarantines and regulations on certain imports, perform insect survey work, and consult on pest management. The greatest number of entomologists are employed in some aspect of economic or applied entomology that deals with the control of harmful insects; this includes methods of controlling insects like mosquitoes while protecting beneficial insects like bees.

Perhaps the greatest entomologist of all time is Edward O. Wilson, who has written many books, among them some of my personal favorites: Sociobiology: The New Synthesis (1975), On Human Nature (1978), and The Ants (1990). His most thoughtful work to date, Consilience (1998), asks some fundamental questions about science. According to Wilson, all knowledge, from the humanities to the natural sciences, can be unified. Along the lines of field epidemiology, a relatively new area of research called medical entomology brings entomologists together with public health professionals, epidemiologists, and other medical professionals in the area of disease prevention.

Dermatology

A dermatologist is a medical doctor who specializes in treating diseases and conditions of the skin, hair, nails, and mucous membranes (the lining inside the mouth, nose, and eyelids). Dermatologists treat over 3,000 different diseases, including skin cancer, eczema, acne, psoriasis, infections, and some autoimmune disorders.

So, in conclusion, I do work with bugs (sort of) but only those very few that cause diseases. Other than knowing that the skin is the largest organ of the body, I know next to nothing about dermatology.

I hope this clears up any confusion.

Two Kirks, One Ship

[Image: James Kirk's evil counterpart]

I recall as a child watching an episode of Star Trek in which a malfunction in the transporter created two Captain Kirks. Although I was very young at the time, I remember wondering: if the transporter disassembled and then reassembled matter, how could there be two full-sized Captain Kirks? Wouldn’t the best possible outcome be two 50%-sized Captains Kirk? Okay, so I was a bit of a science nerd as a kid. Of course, we soon learned that these Kirks were not the same. One was the normal Kirk, the other evil. So even these exact duplicates were not identical, though it was puzzling, even then, why all the evil aspects of a person would be confined to the duplicate.

In reality, how do we know which Kirk was the original? It seemed puzzling, at least to me. And as it turns out, this puzzle was not limited to viewers of Star Trek.

In ancient Greece, the man said to have founded the city of Athens was the legendary king Theseus. Athens at the time was dependent upon the strength of its ships, and as you might have guessed, Theseus was a great naval tactician who fought, and won, a great many sea battles. As a result, and perhaps to capture the magic, the people honored the memory of their king by putting his great ship, afterwards known as the “Ship of Theseus,” on display as a memorial. The ship, we are told, was there for hundreds of years.

As time passed, the wooden ship began to deteriorate. Although Athens sits in a fairly stable climate, rain, sun, and the passage of time began to weaken the planks and the substructure, and workers would replace bits and pieces as needed. Over a few hundred years, most of the Ship of Theseus had to be replaced in order to keep it looking original. The question is: at what point does the ship stop being the Ship of Theseus and start being just some other ship that was, more or less, LIKE the Ship of Theseus?

While the Ship of Theseus probably never actually existed, it has nevertheless remained a philosophical puzzle for thousands of years: the problem of identity. In looking at both Kirk and the ship, what can we truly know about a thing? How do these things change? If the thing changes, can we even agree on what exactly has changed? (After all, the two Kirks looked identical and acted identically, until one started being evil.)

Concerning the ship, how many planks or other bits can be replaced before we no longer have the same ship? Let us suppose, for the sake of discussion, that the ship was composed of 400 pieces. We can agree that replacing one or two bits would make little difference. What if, over time and due to weathering and other forces, we needed to replace all the deck boards? Is this still the Ship of Theseus? How about after the people had to replace a third of all the bits and pieces? How about half? Let’s say that after 400 years the ship was so deteriorated that all but the keel had been replaced. Can we say this is still the Ship of Theseus? Would it even matter?
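
Since the paragraph above is really a question about rates and fractions, here is a small sketch (my own illustration, not anything from the original story) that treats the ship as 400 pieces and swaps out a few worn pieces each year; the replacement rate is an invented assumption, just to see how quickly "mostly original" becomes "mostly replacement."

```python
# A small sketch (my own illustration, with an invented replacement rate) of the
# thought experiment above: a ship of 400 pieces, a few worn pieces swapped out
# each year, and a running count of how much original material survives.

import random

PIECES = 400
REPLACED_PER_YEAR = 3  # assumption: a few worn pieces are replaced each year

random.seed(0)
original_remaining = set(range(PIECES))  # indices of pieces that are still original

for year in range(1, 401):
    for piece in random.sample(range(PIECES), REPLACED_PER_YEAR):
        original_remaining.discard(piece)  # replacing an already-replaced piece changes nothing
    if year in (50, 100, 200, 400):
        pct = 100 * len(original_remaining) / PIECES
        print(f"year {year:3d}: about {pct:.0f}% of the original pieces remain")
```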

The truth is simply that there is no agreed-upon, objective answer, either for the two Captains Kirk or for the question of the “Ship of Theseus.” So at this point you may be thinking: okay, this was an interesting thought experiment, but isn’t this a medical blog? And you would be correct; however, there is always method in the madness, or in this case, in this digression.

By the time you reach the age of 68 to 70 years, nearly every cell in your body will have died and been replaced. You are simply not the same person you were 50 years ago, or even 10 years ago. Like Kirk’s doppelgänger, or the ship replaced bit by bit over hundreds of years, you are no longer the original. This is simply a part of aging. But that raises the question: why do we age? Why can’t we, like Captain Kirk, simply replicate our cells in the same condition as the original?

Each time our cells replicate, copies must also be made of our DNA, a complex molecule that is essential for cellular function and reproduction. One theory is that all this copying will eventually lead to catastrophic errors that cause cellular death. This is one reason given for why our skin begins to lose its elasticity, our muscles begin to atrophy, and our organ systems begin to fail. Maintenance of our biological tissues includes maintenance of the structural integrity of our DNA, critical for cell survival, and, just as important, accurate copying into the daughter cells that replace the original cells.

The concern lies in an enzyme called DNA polymerase alpha. This enzyme constructs DNA molecules by assembling nucleotides and is essential to the replication of DNA. Generally, these enzymes work in pairs in order to create two identical DNA molecules from each single original. However, any errors introduced during DNA synthesis are copied into the new cell’s DNA. Another way to look at it: instead of an old cell with original DNA and a new cell with copied DNA, when each double-stranded DNA molecule is copied you end up with two double strands, each containing one template strand and one newly synthesized strand.

When the cell divides, the result is two cells that are each half new. Should the original cell carry errors, they may end up in either or both of the new daughter cells. Alterations in the fidelity of DNA polymerase alpha could result in a progressive degradation of the information transferred during DNA synthesis, eventually affecting a range of cellular components.

Like schoolchildren playing telephone, once an error creeps into the story, that error is passed along, and entirely new errors are added to the story based on the original error. The result, much as it was in grammar school, is that we end up with a message (encoded in our DNA) that makes no sense at all. When this happens, the daughter cell ceases to function correctly. Eventually this leads to error catastrophe, where the new cell is replicated with such dysfunction that it is essentially useless. At this point, the cell may initiate a process of self-destruction known as apoptosis, or it may be signaled to self-destruct by neighboring cells (apoptosis is one form of cell death being looked at for certain diseases). Interestingly, cancer cells seem to maintain these DNA errors yet avoid apoptosis. No one really understands why, at least right now (another area of research is looking at errors in the replication of telomeres, but that’s a different post).
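
To make the telephone-game analogy concrete, here is a toy sketch (my own illustration, with an invented error rate and sequence, not a model of DNA polymerase alpha) that copies a sequence from copy to copy and counts how far it drifts from the original.

```python
# A toy sketch of the "telephone" effect described above: copy a sequence
# generation after generation with a small, invented per-letter error rate and
# watch differences from the original accumulate, because each new copy is made
# from the previous copy rather than from the original.

import random

BASES = "ACGT"
ERROR_RATE = 0.001  # invented per-base miscopy probability, far higher than real polymerases

def copy_with_errors(seq: str) -> str:
    """Return a copy of seq in which each base is miscopied with probability ERROR_RATE."""
    out = []
    for base in seq:
        if random.random() < ERROR_RATE:
            out.append(random.choice([b for b in BASES if b != base]))
        else:
            out.append(base)
    return "".join(out)

random.seed(42)
original = "".join(random.choice(BASES) for _ in range(2000))

copy = original
for generation in range(1, 101):
    copy = copy_with_errors(copy)
    if generation in (1, 10, 50, 100):
        diffs = sum(a != b for a, b in zip(original, copy))
        print(f"generation {generation:3d}: {diffs} of {len(original)} bases differ from the original")
```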

In the end, we are all very much like the great ship. We are replaced, bit-by-bit, until the inevitable decay overcomes the ability to replace the parts. If only we could replicate ourselves like Captain Kirk. Of course we would run the risk of stepping off the transporter pad evil.


How Leptin, Ghrelin, and other Hormones can Thwart your best efforts to get “thin.”

[Image: leptin]

In 1950, researchers at Maine’s Jackson Laboratory produced a strain of mice labeled ob/ob (obese/obese), carrying two copies of a recessive mutation expressed as obesity. Mice with this mutation grew to roughly three times the size of normal mice, and their appetites were ravenous. They were inactive and suffered from chronic conditions including obesity, constant appetite, a diabetes-like syndrome of hyperglycemia, glucose intolerance, elevated plasma insulin, subfertility, impaired wound healing, and an increase in hormone production from both the pituitary and adrenal glands. They were found to have slowed metabolic processes and lower than normal body temperature. The obesity is characterized by an increase in both the number and the size of adipocytes (fat cells). The excess weight and excess fat persisted even on a restricted diet that was sufficient for normal weight maintenance in normal mice.

Dr. Jeffrey Friedman, a molecular geneticist at Rockefeller University, became curious as to why a defect in just one gene (of the approximately 25,000 genes) would have such dramatic effects on the mice’s weight, appetite, and behavior. He began searching for the gene he suspected was causing this obesity in the late 1980s, using a then recently developed methodology called positional cloning. Positional cloning has become the process of choice for identifying genetic mutations that underlie pathology with Mendelian inheritance. After eight years of searching, Friedman was able to identify and clone the ob gene in mice. He found the corresponding gene in humans, and by 1995 had managed to purify the product expressed by the ob gene: a hormone he called leptin.

Friedman discovered that, through leptin and the regulation of food intake and metabolism, fat functions as an endocrine organ. Obesity, therefore, can best be understood as a problem of biology. Leptin is a 16-kDa polypeptide that is produced primarily in white adipose tissue and secreted into the bloodstream. Like many hormones in the mammalian body, leptin acts to maintain homeostasis. Fat exerts its control over physiology and metabolism by secreting leptin into the bloodstream, where it acts on parts of the brain by influencing the neurotransmitters used to communicate, including the melanocortin peptides that allow the brain to regulate food intake and energy expenditure. When fat is reduced, the level of leptin in the bloodstream diminishes, stimulating appetite and suppressing metabolism until the fat mass has been restored. Conversely, when fat mass increases, leptin levels also increase, suppressing appetite until the weight is brought back to its homeostatic level. This system maintains control of adipose mass and is extremely difficult to circumvent.
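
The paragraph above describes a classic negative-feedback loop, and here is a toy sketch of that idea (my own illustration, with invented numbers, not a physiological model): leptin tracks fat mass, low leptin pushes net intake up, high leptin pushes it down, and fat mass drifts back toward a set point after a perturbation.

```python
# Toy negative-feedback loop: leptin rises with fat mass; the gap between the
# "target" leptin level and the current level nudges net intake up or down.
# All constants are invented for illustration only.

SET_POINT = 20.0       # "defended" fat mass, arbitrary units
LEPTIN_PER_FAT = 1.0   # leptin secreted per unit of fat mass
GAIN = 0.1             # how strongly the leptin signal adjusts net intake

def simulate(fat_mass: float, days: int = 100) -> float:
    """Run the feedback loop for a number of days and return the final fat mass."""
    for _ in range(days):
        leptin = LEPTIN_PER_FAT * fat_mass
        target_leptin = LEPTIN_PER_FAT * SET_POINT
        # Leptin below target -> appetite up, metabolism down (net gain);
        # leptin above target -> appetite down (net loss).
        fat_mass += GAIN * (target_leptin - leptin)
    return fat_mass

print(round(simulate(10.0), 1))  # starts depleted, climbs back toward ~20
print(round(simulate(30.0), 1))  # starts elevated, falls back toward ~20
```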

Because leptin modulates the amount of adipose tissue in the body, it acts on specific receptors in the hypothalamus to inhibit appetite through both counteractive and stimulatory mechanisms, interacting with other hormones such as ghrelin (the hormone that signals hunger) and neuropeptide Y, as well as a cannabinoid neurotransmitter called anandamide, which stimulates appetite. Leptin also promotes the synthesis of an appetite suppressant called α-melanocyte-stimulating hormone. When ghrelin, neuropeptide Y, and anandamide signaling go unchecked, the result is an almost addictive desire to eat. Researchers are finding that as fat mass decreases, the level of plasma leptin falls, thus stimulating appetite until the fat mass is recovered. There is also a decrease in body temperature, and energy expenditure (metabolism) is suppressed.

Leptin also plays an important role in regulating and modulating the onset of puberty. For example, girls from undernourished societies and underweight women take longer to reach puberty than heavier girls. In fact, girls who are too thin often fail to ovulate during menstrual cycles. Adequate fat stores are therefore vital to the regulation of reproduction. In athletics, dancers and other young women in energy-intensive training sometimes cease menstruating due to a lack of adequate adipose tissue.

As it turns out, leptin is so central to the biology of obesity that it has begun to be developed as a therapy for some forms of obesity, including life-threatening metabolic disorders such as lipodystrophy (a condition characterized by abnormal or degenerative changes in the body’s adipose tissue), some forms of diabetes, and hypothalamic amenorrhea (caused when the hypothalamus slows or stops releasing gonadotropin-releasing hormone). Dr. Friedman’s landmark research has spawned a flood of work in laboratories around the globe, resulting in tens of thousands of research articles.

Made by the body’s fat cells, leptin is now understood to be the critical hormone of a very complex endocrine system that not only maintains the body’s weight but also exerts controlling effects on other hormones that govern glucose metabolism and insulin sensitivity, and even immune function. In fact, the neuroendocrine system itself is greatly influenced by leptin. Dr. Friedman came to understand that the ob-mutant mice were completely unable to produce functional leptin; they ate until they became obese simply because their brains went into a permanent starvation mode. And when given leptin supplements, these ob mice ate less. They lost weight and became more active. It was also found that they responded better to insulin, a significant factor in Type II diabetes and metabolic syndrome. Mouse models are used to better understand disease states because their genetic, biological, and behavioral characteristics closely resemble those of humans, and many symptoms of human conditions can be replicated in mice. It should come as no surprise that humans with leptin mutations, who are likewise unable to produce leptin, are massively obese and suffer from chronic conditions that drastically improve with leptin therapy.

When our bodies are functioning properly, excess fat cells produce leptin, which in turn triggers the hypothalamus to lower the appetite response, allowing the body to draw on its fat stores to feed us. Unfortunately, someone suffering from obesity will have a great deal of leptin in the blood, and this can lead to leptin insensitivity, or leptin resistance. Because the brain no longer registers the satiation signal, the person keeps feeling hungry and keeps eating, the fat cells produce still more leptin to get the signal through, and the rising leptin levels breed yet more insensitivity.

This does not mean that all persons suffering from obesity have similar mutations. In fact, some studies have shown that only a small subset of people with obesity were able to lose weight with leptin therapy. Most humans, like most mice, produce ample amounts of leptin; their bodies are simply resistant to it. One possible means of reducing this leptin resistance is combination therapy with other hormones, in particular amylin.

Amylin is a peptide hormone that is co-secreted along with insulin from the pancreatic β-cells and has been found to be deficient in patients with Type II diabetes. By combining leptin with amylin, at least in clinical studies with obese patients, a reduction of about 13% of body weight has resulted. If you are struggling with obesity or weight, these hormones can affect your overall health as well as your scale. Excess body fat causes problems with hormonal secretion, neurochemistry, and even immune function. While leptin has been called the starvation hormone, this is not accurate; a better term might be the satiation hormone, as leptin inhibits hunger, regulates energy balance, and prevents the body from triggering hunger responses when energy is not needed.

Low levels of leptin are rare but do occur. A very few people suffer from a genetic condition called congenital leptin deficiency, which prevents the body from producing leptin. Without leptin, the body is confused into thinking it has too little body fat, signaling the brain to eat and resulting in intense, uncontrolled hunger. This often manifests as childhood obesity and delayed puberty. The treatment for leptin deficiency is currently leptin injections; however, significant research is now taking place.


Pluralism, the Profit-driven Healthcare Market, and Health.

With the new incoming president and administration, many are clamoring for an end to the Affordable Care Act. If they were honest, politicians would admit that the act was neither affordable nor really focused on health care so much as pork projects, and was not so much an act as a law designed to force the sale of a service. Currently, Americans spend over 17% of our nation’s GDP on medical care, the vast majority of it focused on treating existing conditions, with very little spent on preventing those same conditions. Not that we would listen anyway. We are not really big on prevention, as that would require us to modify our behaviors. Meanwhile, most public health organizations are forced to beg for money to keep even their basic prevention programs afloat, while fast-food (an oxymoron if there ever was one) and junk-food sales have reached an all-time high. Sounds more than a bit self-defeating, doesn’t it?

Our nation currently ranks number one in healthcare costs per person, and America tops everyone in cost of care relative to GDP. The U.S. also scores poorly on many fronts, ranked 11th out of 11 nations in the Commonwealth Fund’s 2014 report while outspending all other first-world nations on healthcare. You would think that for this enormous cost we would be receiving the best health care in the world. Not if we use quality of life, infant mortality, longevity, and general health as measures.

If we take even a cursory look at the three highest-rated nations for healthcare in the world (according to the World Health Organization), at least currently, we can formulate some idea of how we might go about replicating these top healthcare systems rather easily and relatively cheaply, at least compared to what we have now.

What we need, then, or so it would seem, are some examples to use from among the top healthcare nations, a “Best of the Best.” According to many researchers, the top ten healthcare nations are, in order, #10-Sweden, #9-Switzerland, #8-South Korea, #7-Australia, #6-Italy, #5-Spain, and #4-Israel. All share similarities, for example a focus on prevention, public health, and patient responsibility. But let’s just look at the top three.

#3. The Japanese health system is based upon universal healthcare, backed by mandatory participation through payroll taxes paid by both employer and employee; contributions are income-based for the self-employed. Add to this long-term care insurance, required for citizens over 40. Japan has controlled costs by setting flat rates for everything from checkups to surgery to medications, which has the obvious effect of removing competition among insurance companies. And while most hospitals are privately owned, as in the United States, smart regulations ensure they remain focused primarily on egalitarian care.

#2. Singapore has an even better health care system, largely funded by individual contributions and based on preventive medicine, public health, and individual responsibility. Individuals are required to contribute a percentage of their salary based on age, although the government provides a safety net to cover expenses beyond what the individual may be able to afford. While private healthcare plays a significant role in the Singapore system, it is secondary to public health and the public hospitals, which employ the vast majority of doctors and other health professionals.

#1. Hong Kong, at least currently, has the best health care in the world as well as arguably the freest economy. Hong Kong manages this by offering universal health care with significant government participation. Hong Kong’s health secretary calls public health the cornerstone of their system, and public hospitals account for 90% of all inpatient procedures. There are private hospitals in Hong Kong; however, these are generally used by the wealthy.

Currently, at least, the U.S. does provide the best medical training in the world; however, our healthcare, although the most costly, is far from the best. For population health, infant mortality, maternal health, longevity, and chronic disease, we are somewhere near 20th place. What can we do to improve healthcare in America? That’s a bit complicated and will take some explaining. Warning: personal opinions ahead.

Step 1: Provide Basic Health Maintenance to all Americans.

The adage “an ounce of prevention is worth a pound of cure” is nowhere truer than in health care. Public health and preventive medicine have lengthened lifespans by decades, reduced fatalities from traffic accidents and accidental poisoning, and helped combat environmental disease. Yet the funding for preventive medicine and public health is infinitesimal compared to the overall healthcare budget. While preventive medicine has been a specialty for over half a century, it comprises only 0.8% of the physician workforce. There is an inadequate focus on prevention in medical school curricula, and physician training in preventive medicine as a specialty is an economic catastrophe due to the lack of funding. When you combine this with the profit-driven treatment ideology of American medicine, and the very small amount of funding the federal Health Resources and Services Administration provides for preventive medicine training programs, we end up with a treatment, rather than prevention, focus. Treatment-focused medicine is very advantageous for hospital administrators and highly profitable for pharmaceutical companies, yet less so for the folks who actually treat the illnesses, and of course often financially devastating for those unlucky enough to find themselves on the receiving end. If we know that prevention is the best method to reduce healthcare costs, morbidity, and mortality, why not create single-payer basic coverage for Americans? It has worked in every other first-world nation; why not here? (To guarantee success, do not repeat what bureaucracy- and administration-heavy ideologies have done to the American public education system.)

Step 2. End malpractice insurance requirements and introduce Tort Reform.

The claim has been that without the right to file suit, patients are vulnerable to bad doctors and bad procedures. Yet the cost of frivolous suits, often settled rather than fought due to the cost of lengthy legal battles, and the resulting increase in the cost of medical care due to high premiums for liability insurance have a detrimental effect on health care providers. Combine this with the realization that America is a very litigious society, and you have a recipe for costly healthcare. There can be no question that malpractice lawsuits have drained the system for decades. Americans once accepted that there are certain risks involved in every serious medical procedure, and when things went wrong they did not immediately look for an attorney and someone to blame. This is especially ridiculous in a culture that places so little emphasis on preventive medicine and public health. We eat what we want with little regard to how healthy it is, we are sedentary, and the majority of us do not get regular exercise. And should we have complications from open-heart surgery due to life choices, we immediately look for someone to blame. Make no mistake: things do go wrong in some medical procedures; they are called risky for very good reasons. Yet today’s OB/GYNs are paying $100,000 per year for insurance, often to protect themselves from grieving parents and emotional juries seeking to hold them responsible for outcomes that no one could possibly have prevented. I am not suggesting that the legitimate right to sue a physician who screws up badly should be eliminated, only that some system of reform is desperately needed. Going hand-in-hand with the need for legal reform, the cost of huge malpractice coverage drives up the cost of healthcare.

Here’s an idea: during the 1980s, one could purchase “accidental death” insurance before flying on a commercial airline. Why not make this a part of medical care? There is no need for costly malpractice insurance for the annual checkup; however, if you have to undergo a risky procedure or an operation, why not purchase coverage for that event only? The policy would be temporary and would cover the outcomes of that procedure only. There is little need for the average family practitioner, who seldom wields a scalpel or ventures near an operating theater, to pay out tens of thousands of dollars for liability insurance. Of course, the insurance companies are quite happy to take the money.

Step 3. Make people responsible for their own preventative health.

A healthy diet and even a modicum of exercise produce a moderate reduction in the risk of stroke, cardiovascular disease, diabetes, certain cancers, and coronary artery disease. These five maladies alone account for 74 percent of the total fatalities attributable to the U.S.’s top 10 leading causes of death, according to a report by the Centers for Disease Control and Prevention. Along with providing basic coverage for things like childhood vaccines, wellness checkups, and annual physicals, making people responsible for their own behavior may have fallen out of favor in America today, but it makes sense, both physiologically and financially. If patients refuse to take responsibility for their healthcare, for example by refusing to participate in prevention or wellness checkups, they could be dropped from the program, providing significant motivation to maintain their own health. Ultimately, you and you alone are responsible for your health care. We must remember that here in the U.S., hospitals are duty-bound to treat emergency cases, and government spending pays for a surprisingly large share of visits to medical practitioners and treatments through a patchwork of public programs. These include Medicare for the old, Medicaid for the poor, and still other programs for kids. That being said, there are no incentives to try to remain healthy, no responsibility expected from patients, and any suggestion of diet or behavioral changes is often met with derision.

Step 4. Put Healthcare experts in charge of healthcare.

As Americans we rail against the administration-heavy bureaucracy of politics, and rightly so, but why then do we accept it in healthcare? Currently, experts state that physicians account for roughly 8 to 9% of healthcare costs, yet ultimately they shoulder the responsibility of accurate diagnosis and treatment. The same bureaucratic takeover that has diminished public schools in America (where districts once got by with two principals and a superintendent for 1,000 students, the same student population today may have two superintendents, three assistant superintendents, four principals, and four assistant principals, as well as two educational directors) is currently running wild in American medicine. A recent study for the State of California broke down the healthcare dollar this way: 31 cents went to hospitals, 22 cents to physicians, nurses, and treatment staff, a dime went to prescription drugs, another 10 cents to dental care, 9 cents to nursing homes, 7 cents to administrative costs, and 10 cents to items like medical equipment and hospital construction. The two biggest gainers were prescription drugs (up 15 percent) and administration (up 16 percent).

Step 5. Introduce product competition.

This goes hand-in-hand with making patients responsible for their own healthcare. Instead of requiring standard co-pays, make patients responsible for seeking the most cost-effective treatment. Changing from a copay system (for example, $25.00 regardless of the doctor you visit) to a percentage (for example, 20%) would reduce healthcare costs. If a visit to Dr. A were billed at $200, the patient would pay $40.00; however, if Dr. B offers the same service at $125, the fee would be $25.00. This is how many patients purchase prescription medications; why not create competition among service providers to reduce costs? Currently, if the doctor you see charges $300 per visit and you pay a copay of $20.00 regardless, why do you care how much they charge? Let’s introduce some competition. While even the best (so-called Cadillac) insurance policies have huge deductibles, few can escape the often crushing weight of medical-related bills, due to a combination of bureaucracy, risk-averse medical practices rooted in fear of a litigious culture, and a refusal of average people to take responsibility for their own health.
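
Here is a quick sketch of the arithmetic in this step, using only the numbers from the paragraph above (the flat copay and the 20% rate come from the example; everything else is just illustration).

```python
# Compare what the patient pays under a flat copay versus a percentage
# (coinsurance). Under a flat copay the patient pays the same no matter what the
# provider bills; under a percentage, the cheaper provider costs the patient
# less, which is the point of the proposal.

def patient_cost(billed: float, flat_copay: float = 25.00, coinsurance: float = 0.20) -> dict:
    """Return the patient's out-of-pocket cost under each scheme."""
    return {
        "billed": billed,
        "flat_copay": flat_copay,
        "coinsurance": round(billed * coinsurance, 2),
    }

print(patient_cost(200.00))  # Dr. A: $25 flat either way, but $40 at 20%
print(patient_cost(125.00))  # Dr. B: $25 at 20%, so the cheaper doctor saves the patient money
```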

So why can’t we just model a healthcare system after, say, Japan or Denmark?

Well, it’s not that easy. A big part of it is cultural. Take teen pregnancy, for example. Americans have the highest rate among developed nations, and in our culture we have decided to reward it (and even have reality television shows about it); in other cultures, for example Japan or Hong Kong, it’s seen as something to be avoided at all costs. In Denmark and Norway, people do not have to be forced to practice preventive medicine; they seek out the information. Finally, bad behavior and the negative health impact that goes along with it are often rewarded, or at the least excused, in our culture. So long as American medicine remains a profit-driven rather than care-led culture, where trivial health-related lawsuits are put before vindictive, emotional juries, and where public health and prevention are not incentivized, we will continue to have emergency departments clogged with chronic conditions, mental illness, and preventable illness. Through a combination of underinsurance for most, overinsurance for a few, and a disregard for public health and prevention by most, the citizens of the U.S. will continue to suffer the highest healthcare costs in the developed world, combined with some of the worst health outcomes, particularly if you are poor, female, or a child.

In the end, simply doing things the same way and hoping for improvement is ineffective at best and dangerous at worst. Let’s get medical experts back in charge of medicine, rein in profit-crazed insurance companies and our litigious population, and accept that maybe, just maybe, we are responsible for how crappy, sick, and unhappy we are.

Surfing the Web with a Cave-Man Brain, or Art Appreciation, 40,000 BCE.

[Image: cave walls]

Paleoanthropologists and paleogeneticists tell us that our species, Homo sapiens, emerged pretty much as we appear today sometime before 100,000 years ago in East Africa. Since that time, most of the evolving we have done has been sociological rather than biological. In other words, if we took a Homo sapiens from 95,000 years ago, gave him a bath, a haircut, and a suit, he would be indistinguishable from any other man in America. He would probably not be as overweight, but as a rule, he would blend in. His physiology and neurology have not changed all that much. This is why I believe that the things that were most important 100,000 years ago are still most important to us today.

The needs and desires of our ancestors are still very much alive today. Despite our belief in our advanced intelligence, we are still operating on very old software: the brain of a cave-man. Some social scientists have described this principle as the outcome whenever a modern human has to choose between modern technologies and primitive desires. The desires, they claim, win every time. They clearly have a point. What was once considered proof of prowess and power, the kill, was visible evidence. No one cares about the one that got away. If I cannot touch it, it doesn’t exist.

Could this be why modern humans often insist on printing emails? To have that physical item, that proof, in our hands. It’s as if having the information on a computer screen is not real. We demand proof. Remember the 1990s claim that soon we would all be working in paperless offices where files would be kept electronically? Never happened, never will. Our caveman brains demand proof, not an image that disappears when the power is switched off, or that we cannot show to our friends to both impress and attract them (imagine showing up on your anniversary with a photo of flowers or a list of the chemical ingredients in chocolate).

Our ancestors preferred face-to-face contact. They wanted to see, to smell, and to touch the other person. This was the basis of social bonding for thousands of years. People are far more real in person. In an electronic community, faceless “friends” can argue, fight, and demean one another; it is simple when you are not face-to-face with the other person. We need that social intimacy to size each other up. So much communication takes place visually, unspoken, in ways that simply cannot happen on an online social networking site.

As social mammals, we feel a deep need to watch each other up close. This allows us to read the roughly 65% of communication that is non-verbal. Body posture, stance, movement, and facial expressions tell us a great deal about what is going on. This is not possible in a faceless chat room. Through close social connection, we make the common bond that allows us to convey information, seek support, or gain acceptance, something we strive for regardless of what we may say to the contrary. Before the development of language, body posture, facial expression, and movement passed on a great deal of information, including emotions like anger, jealousy, and fear. Our caveman brain is still wired for this, yet this is impossible in the online world.

Being there with others, making the physical connection, being present is still a critical part of who we are as a species. Friends still prefer the real connection to online social websites, still prefer watching movies together, rather than alone. All of this seems to fly in the face of modern technology, and especially modern advertising.

In the age of the Internet, social media, tweets, messengers and alerts, we are more separated and alone than ever. And yet we still prefer to see things, experience things, as our ancestors did; face-to-face. It may have been crucial for survival in the forests and jungles to rely on what we can see for ourselves, what we can experience first-hand. Let us remember that we are all descended from predators who hunted, chased, stalked, and captured prey long before settled agriculture developed grain crops. But without huge teeth or sharp claws, we were forced to be wary. We became watchers. Today we still enjoy watching others and even sit for hours in front of a TV, endlessly watching other humans, and we seem to enjoy this far more if we are watching this media with others.

At the same time that our ancestors watched for opportunities to prey on other species, we were not at the top of the food chain. Perhaps this is why we get very nervous when we feel or see others watching us. We get nervous when watched, even by those we know. Scientists claim that after being watched for just a few seconds, most of us will become irate and may become hostile toward those watching us, especially if that person is a stranger. For this reason, video chatting software such as Skype or FaceTime was slow to become popular. Even today, after years of positive marketing by social media companies featuring attractive celebrities, video conferencing is slow to catch on, and the vast majority of us still prefer to do our online socializing by text, despite our need for human contact.

The competition between the Internet and the interpersonal continues: staring at a monitor or spending time face-to-face. At the end of the day, few of us will choose to spend time with a computer rather than with others, family or friends. Live theater will always exist despite television, concerts despite recorded music, and tourism despite the Internet and virtual reality goggles.

Today, our cave-man brain tries its best to adapt to both worlds, the real and the virtual, but given a choice, like our cave-man ancestors, we will choose interpersonal, face-to-face contact nearly every time. So what are we to make of the seeming popularity of social media, social networking, and chat rooms? The answer may lie in our past: those of our ancestors who managed to attract and maintain large social networks could rely on others for resources, advice, help, and protection, all necessary for survival.

While the Internet may have been created for the rapid sharing of secure data among scientists, it has evolved into an entertainment colossus. This should come as no surprise; the paintings found on cave walls in many parts of Europe and Asia from over 40,000 years ago suggest that we entertained ourselves after the hunt with stories and shared accounts of the events of the day. Socially, this shared experience is crucial for establishing connections, for bonding with others, and for establishing the storyteller’s place in the group.

The creation of art not only entertained, it informed, and as a result it played an important part in the neurological development of our brain, which seems to process most information symbolically. Our language is highly symbolic, our earliest written languages were highly symbolic, and today science and mathematics rely on symbolism to convey often complex ideas. Piaget, a noted child development researcher, found that children develop symbolic language and symbolic function to master the ability to picture, remember, understand, and replicate objects in their minds that are not immediately in front of them. In other words, children can create mental images of objects and store them in their minds for later use. Just like the cave paintings of 40,000 years ago. And although you can view these wall paintings on the internet, they are far more impressive seen, and touched, in person. But that is just the way our cave-man brain works.

A Bit of Information..

Recently, a clinical epidemiologist about to take a field position in Africa asked where they might find a listing or a graphic of some sort that captures the most common parasitic conditions they could be asked to deal with. I did a quick search and, not finding anything, decided to create one myself. This is NOT all-inclusive; however, I think it will cover those they are most likely to deal with.

[Image: the most common parasitic diseases and conditions]

Dogma and the Triumph of Science in Medicine. At least for now.

[Image: breathing a vein]

Medicine has in its history a most frivolous period, wherein virtually anything that could be dreamed of for the treatment of disease was tried and, once fixated on as a cure, would be held fast for decades or even centuries before being given up as useless at best and detrimental in general, most often based on nothing more than trial and error. A great deal of human suffering came at the hands of these medical practitioners, as their belief in a practice as curative did not require that the patient actually improve. During this time medical treatment caused far more harm than good, as humors or bad air were considered the cause of most illnesses, and doctors did not sterilize their hands or instruments.

At one time, bleeding, cupping, purging, and fasting were all considered the pinnacle of medical science. The medicines prescribed for most any ailment were worse than the practices. These included infusions of concoctions made from plants, solutions of every imaginable combination of metals, and diets restricted to one thing or another, often based on nothing more than misunderstood imaginings of the causes of diseases. Medicine before the turn of the 20th century was not far removed from the same dogma as religion, concocted in the minds of so-called scientists based on whims and opinions.

In retrospect, it is amazing that the practice of medicine got away with causing so much death for so many centuries with so little public outcry. But don’t look now: unscientific medicine has been making inroads in the least likely place on the planet, the United States. In America today, some practitioners are once again decrying science and opting for simpler, more down-to-earth treatments for physical and psychological ailments. I guess they miss the good old days.

During the 18th century in America, medicines were often made from botanicals. Most physicians’ offices and druggists used a catalog of herbs that explained how they worked, where they grew, and what they cured. Of course, during this time it was also thought that submerging patients in ice baths was an effective way of treating many ailments, including fever from influenza. Ice baths were also considered effective treatments for convulsions, plagues, typhoid fever, and insanity. I suppose once the patient went into shock, they could be said to have been cured.

Another unscientific, and I would argue dangerous, belief system making a comeback is homeopathy. Homeopathy is the belief that a sick person can be cured by treating them with something that would give a healthy person the same symptoms. The idea was first proposed by Samuel Hahnemann in Germany around 1796, and it quickly spread across much of Europe. Of course, every remedy was fashioned from something organic, said to have come from the earth. Many practitioners of homeopathy do not believe in vaccinations or antibiotics, and instead use greatly diluted substances. Oddly, homeopaths believe that the more a substance is diluted, the stronger the potency or power of the treatment. If this sounds counterproductive, that’s because it is.

Most scientists will agree that homeopathy relies heavily on the placebo effect. How else could a person be convinced that one part per billion, less than can even be perceived, has curative powers? Purely psychosomatic.
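
For a sense of scale, here is a back-of-the-envelope sketch (my own illustration, with assumed numbers, not anything from the post): it counts how many molecules of an active substance are left after a given dilution, and past a certain point there is, statistically, nothing of the original substance left to act at all.

```python
# Assumptions for illustration: one gram of a substance with a molar mass of
# about 300 g/mol (typical of a small molecule), diluted by a given factor.

AVOGADRO = 6.022e23  # molecules per mole

def molecules_remaining(grams: float, molar_mass: float, dilution_factor: float) -> float:
    """Molecules of the original substance left in the dose after dilution."""
    moles = grams / molar_mass
    return (moles * AVOGADRO) / dilution_factor

print(f"{molecules_remaining(1.0, 300.0, 1e9):.1e}")   # one part per billion: ~2e12 molecules left
print(f"{molecules_remaining(1.0, 300.0, 1e60):.1e}")  # a 10^60-fold dilution, typical of the more
                                                       # extreme preparations: effectively zero molecules
```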

It’s not simply a desire for a simpler time that has gripped the peripheries of modern medicine. The most significant improvements in the 18th century came from the scientific process of understanding disease, not as a product of evil, of night air, or of some imbalance of the four humors (yellow bile, black bile, blood, and phlegm) first imagined by the ancient Greeks, but as some change, internal or external, to the balance of the body that interfered with normal functioning. Perhaps the most significant improvements to human health came from public health, which drove vast improvements in sanitation and hygiene during the 18th through the 20th centuries, along with vaccinations against childhood illnesses that once caused the deaths of tens of millions.

The use of vaccination began in the 18th century to battle smallpox, a disfiguring and often fatal disease that had become epidemic in both North America and many countries in Europe. Another major disease, scurvy, often faced by sailors, was found through scientific investigation to be caused by an inadequate supply of vitamin C. The treatment was simple: fruits and vegetables (and the vitamin C was not diluted to one part per billion). What seems clear is that, due in large part to scientific illiteracy in America, what can only be deemed a magical belief in centuries-old curatives is returning.

The long habit of medicine prior to the 19th century was to treat every ailment with something, anything, often to the detriment of the patient. One treatment for sores, cow dung, was used for centuries despite not working and often making the condition significantly worse. It seems not to have occurred to these men that some conditions, if left untreated, simply improved over time, unless of course the patient sought medical treatment. Dr. Edward H. Clark of Harvard espoused in 1876 that patients with typhus and typhoid sometimes got better without treatment and certainly would have been the worse for the bizarre fermentations of herbs, the infusions of heavy metals, or other “treatments” that had been popular just a few decades earlier.

The science of medicine replaced what was considered the art of medicine; in reality, the scientific approach to medicine came to replace the magic and belief of medicine.

And yet today we find ourselves once again happily and absent-mindedly gazing back down the path we have, as a species, trodden from the dark ages of medicine, and thinking how much better it used to be. If ignorance is truly bliss, it is an unhealthy bliss wrapped in the unscientific dogma of faith. Pass the holy water, please. One drop per billion, thank you.