My Journey with Food, Part 2

In my second year of college, I was eating primarily out of my dorm room, cooking brown rice, tempeh, and other macrobiotic foods with an electric steamer and hot pot. I was no longer depressed, though I was frequently hungry and had to eat heaping platefuls of macrobiotic foods just to feel full. My dependence on junk food was gone, but at the same time I was hesitant to branch out to any foods I wasn’t already familiar with.

Then I met Katy, the woman who would become my wife. She was a year behind me in college, but we had a common interest in swing dancing and were both considering medical school. As we spent more time together, I automatically assumed she would find my food (and me) weird. To my surprise, she didn’t think of me as an organic freak.  Though Katy grew up eating typical American food, the majority of it was home-cooked by her mom. She also had friends in her rural hometown who ate organic, home-grown whole foods, which tasted better than just about anything else she had tried.

The real difference between Katy and me was vegetarianism. I objected to eating meat, and she didn’t. It had the potential to be a serious obstacle to our relationship. But on our first dinner date at an Indian restaurant, Katy offered to let me order for her. I explained that I didn’t know much about meat. “I don’t have to have meat,” she said. “I’ll get what you like.” Not only that, but when Katy learned about my allergies, she applied her cooking and baking skills to making the meals she loved with soymilk or Earth Balance instead of milk and butter, so that I could have them too. Katy’s grace in meeting me more than halfway made me much more willing to try foods she liked that I had been picky about for years.

Katy and I both lived off campus during our last two years of college, and did all our own cooking. We gradually ate a greater variety of healthy foods and saw corresponding improvements in our health. In my senior year, I was having a conversation with a friend about diet and was blathering on about the immune system. He promptly suggested I become an immunologist after graduation. Although I had considered becoming a doctor, the idea ultimately did not appeal to me because I felt it would entail treating the symptoms of health problems, rather than the causes. My friend’s suggestion, however, made me realize that advising people on diet and lifestyle would be an effective way to promote health.

The question was where to acquire my education in nutrition. I knew enough about certified nutritionist and dietitian programs to rule them out. I didn’t want to tell people to count calories, take supplements, eat artificial sweeteners instead of sugar, and drink more milk “for strong bones.” From my own experience and from the reading I had done, I was aware that conventional nutrition advice was not always grounded in good science, and was rarely effective in motivating people to get healthier. After a good deal of research, I located a school called the Institute for Integrative Nutrition, which taught all the major dietary theories, but had a core emphasis on whole foods, traditional diets, stress reduction, and counseling people on overcoming difficult emotional relationships with food.

At IIN, I learned that the Macrobiotic diet on which I had grown up was so effective because of its ancient roots as a traditional Japanese diet, evolving over thousands of years to meet the nutritional needs of the people who lived on it. In fact, pretty much all the solid dietary wisdom we received in school was based on what people ate traditionally, though the specifics of a good diet differed from climate to climate. What remained in common, however, was the principle of eating whole natural foods, in season, and in the right proportions.

To my surprise, I learned that for my body type, some meat in the diet might be necessary for health. Since I was in school in New York and Katy was still in college, I decided to try making a hamburger (grass-fed, free-range) myself. She warned me against it. “Just wait until I visit you, and I’ll make it for you. At least, if you have to make it, make sure it’s not gray in the middle.” I made the hamburger, and as I ate it, wondered what the big deal was all about, and why so many people loved red meat. Of course, the hamburger was gray in the middle – I had been distracted during that part of the conversation with Katy. Eventually she showed me how to cook it correctly, and I started adding more meat and fat to my diet. For the first time in my life, I felt full on a regular basis, and I noticed that the sugar cravings that had plagued me on and off throughout my life were gone.  When Katy’s mom, who had never been entirely comfortable with my vegetarianism, learned that I had gone to a hippie nutrition school and learned to eat meat, she became willing to eat kale on a regular basis (and now likes it).

At IIN, I also learned for the first time about traditional raw milk from grass-fed cows and its greater digestibility when compared with pasteurized milk. Due to my allergy history, however, it was another year until I found myself willing to try it. Since then, I’ve included raw milk in my diet on a regular basis with nary an allergic reaction. Recently, Katy mastered the art of traditional whole wheat sourdough bread, and I’ve been able to eat it as well without a problem.

Nowadays, when people ask me if I have any dietary restrictions, I say “none.”  I’ve gone from someone who always felt like the pickiest eater in the world to someone who is willing to eat anything. It’s not that I think everything is healthy, or right, to eat, but if I want to guide others in dietary matters, I have to be open to trying their food as well, just as my wife was for me on our first date.  While I’ll never be able to eat junk food like I did in college, and still be healthy, the important thing is that I don’t want to. Thanks to my education, I’ve learned how to eat a healthy, balanced diet that meets all my nutritional needs and satisfies my cravings. It’s an area of my life that is no longer a source of stress, nor is it putting me at risk for illness. And while not everyone might thrive on the exact same balance of whole foods that is suited for me, every person is capable of achieving the same type of success with diet and health. What I love about my work as a holistic health counselor is the opportunity to guide others into that place, and to see the amazing and long-lasting improvements in their health that result.

 

My Journey with Food, Part 1

As a holistic health counselor, I regularly give people advice on how to eat and how to develop a positive relationship with food. But my own relationship with food was once very difficult. When I was just a few years old, my parents discovered that I was strongly allergic to wheat and dairy products, and mildly allergic to citrus fruits and nuts. But instead of getting a rash or a runny nose, I would have an emotional breakdown and go into hysterics after eating these foods. Only when the foods were out of my system would I again recover my emotional balance.

Partly to avoid these allergens, my family followed the Macrobiotic diet, which was based on whole, organic foods, particularly traditional Japanese foods. As a result of the diet, we enjoyed good health and energy and rarely got sick.  However, I did have occasional sugar cravings, as well as cravings for the foods to which I was allergic. I also grew up an excessively picky eater. From childhood, I was used to brown rice, miso soup, sea vegetables and greens, and was apprehensive about trying foods outside my macrobiotic “comfort zone.” I dreaded having to eat at friends’ houses or at non-Japanese restaurants where I might be served something I didn’t like. My pickiness, combined with my allergies and my decision to be a vegetarian, meant that finding food I could or would eat was always a stressful situation for me and my family.

During my teenage years, my family stopped following the macrobiotic diet as strictly as before.  Although I wasn’t exactly a “junk food vegetarian”, I didn’t eat as many balanced meals as I had in the past. I liked to snack on rye bread with margarine, trail mix (as my nut allergy had diminished), and corn chips, and I didn’t eat many vegetables. Every once in a while, we had a big macrobiotic dinner that helped keep my health on track, but I didn’t make the connection, instead taking my good health for granted. In fact, when it came time to go off to college, I thought to myself that I would be able to get by on trail mix, energy bars and soy milk, without suffering any health problems. I didn’t even think eating the foods to which I was allergic would be such a big deal.

Unsurprisingly, the campus cafeteria had almost no appetizing vegetarian, non-dairy options. I was constantly hungry, and gravitated towards sugary foods like cookies and candy, which was embarrassing, as all my friends knew I came from a “health food” background. But I didn’t think very seriously about the consequences of eating so much processed food, and didn’t expect anything bad would come of it. In the meantime, I enjoyed my junk food far more than the poorly prepared whole grains, beans, and vegetables in the cafeteria.

Everything changed, however, over the course of one Sunday in my second semester of freshman year. I was enjoying college in general and had been having a particularly good week. But in the midst of a normal conversation with my friends after breakfast, I began to feel an overwhelming sense of despair. I had no idea where it was coming from, but it got worse over the course of the day. I hoped inwardly that a good night’s sleep would banish it. I recognized this strange feeling as one that had come over me during the last few days of my first semester of college, just before I went home for a few weeks. At that time, it had not been as strong, and right after it happened I had benefited from a lot of macrobiotic home cooking. This time, however, my depression did not go away overnight, over the weekend, or even during the next week – it just got worse. There was nothing going on in my life to be depressed about, but I couldn’t shake the feeling regardless. My life – all of reality, in fact – felt empty and meaningless, and I felt terribly sad, but for no good reason. No breakups, no deaths in the family, no financial worries, no legal issues. College was hard work, but I had been relishing the challenge.

Without a macrobiotic background, I might have just chalked my inexplicable depression up to a chemical imbalance in my brain and gone to a doctor or psychiatrist for mood-altering medications. But instead I called my parents and told them what was happening. They immediately recognized the symptoms of my allergies, and I acknowledged that I had been eating lots of sugar, white flour, and dairy products in candy and baked goods, while almost completely avoiding vegetables.

My parents’ theory sounded plausible to me, even though believing it didn’t by itself take away my severe depression. I didn’t feel like taking care of myself, but nevertheless I forced myself to put the effort into eating differently in the hope that it would take away the horrible emotions I was experiencing. I bought a rice cooker and vegetable steamer to go with my electric hot pot, and started making macrobiotic lunches and dinners in my dorm room. Within the next few weeks I gradually began to feel better, but remained anxious that the improvement was only temporary. In the end it took several months of eating right and avoiding junk food entirely for my depression to fade away. In the next semester I borrowed some macrobiotic books and began teaching myself to cook some basic meals. Having seen the effects of healing foods on my own body, I became fascinated with whole foods, their benefits for various health problems, their traditional usage, and how to prepare them. Mainly, I realized how much I didn’t know about nutrition and health – and how many foods I had never even tried. Despite my lifelong allergies, I had devalued and ignored healthy food, the very thing that made it possible for me to function, and I clearly had a lot to learn.

 

To be continued next week!


Vaccination, an Overview, Part 5: Two Approaches to Vaccination

The lack of knowledge regarding the primary cause of the epidemics of disease in the previous centuries – that is, malnutrition and exhaustion leading to weak health – can be attributed to the medical establishment’s general emphasis on the symptoms, rather than the causes, of illness. This emphasis is not a feature solely of modern medicine or even of mainstream medicine. Throughout history, both mainstream doctors and alternative health practitioners have sought to provide us with “quick fixes” and convenient solutions to the symptoms of problems initially caused by poor diet and lifestyle, rather than recommending a diet and lifestyle that would help prevent such problems in the first place.

The first flaw in this approach is that the medical interventions intended to eliminate our symptoms invariably have significant side effects of their own, which then require further interventions with additional side effects. A second flaw is that in situations where such interventions are not available, if we have not been taught how to take care of our health, we are entirely at the mercy of disease.

We can think of suffering a severe reaction to an infectious disease as a “symptom” of being in poor health in the first place. However, the medical establishment, by minimizing the role played by a healthy immune system, implies that we are all equally defenseless against pathogens. The consequences of this approach are the ever-increasing vaccine schedule and also our society’s “germophobia.” In fairness, in recent decades, acknowledgement of the influence of diet and lifestyle on healthy immune function, and of the abilities of the immune system itself, has increased. But when flu season strikes, the message we hear is primarily to get vaccinated and to minimize the spreading of germs. Just as is frequently the case with diet, we try to eliminate what’s bad but we don’t try to replace it with what’s good.

In taking the vaccinate-and-sanitize approach to fighting disease, rather than an approach that promotes health, we may simply have shifted the battleground. Part 3 already discussed the real and potential debilitating side effects of the vaccines we administer to our children and ourselves.  As for sanitization, while it can eliminate potentially harmful germs from the environment, it often does so by means of toxic chemicals. And lack of exposure to germs is likely to result in oversensitive immune systems that will react negatively to pollen and commonly eaten foods.

In accordance with the second flaw of the symptom-focused approach, children are left vulnerable in instances where vaccines do not succeed in providing immunity, where they are unavailable, or where they have not been invented. And when unhealthy children enter germ-laden environments, as is inevitable given that they spend more time in the doctor’s office or hospital, they are at serious risk. Consider that in the United States an estimated 80,000 individuals per year die from infections contracted while in hospitals, most frequently due to catheters – which is not surprising, given that hospital workers only wash their hands about 70% of the time. While our general environment, and our hospitals as well, are far more hygienic now than they were 150 years ago, and we are generally better nourished as well, we still have a long way to go.

As a short-term strategy, vaccinate-and-sanitize saves lives. But like most approaches that only seek to address the symptoms, not the causes of our health problems, it inevitably results in new, mysterious health problems. At best, focusing on vaccines largely maintains the status quo. The Bill & Melinda Gates Foundation is an excellent example. The world’s largest private foundation, it possesses an endowment of $33 billion, and devotes a significant portion of its resources to improving global health. Vaccination and medication programs in third-world countries are the primary beneficiaries of these resources. However, after decades of giving, vaccine-preventable and other diseases still persist in these countries, and the individuals receiving vaccines and medications often lack basic needs such as nutrition, clean water, and transportation. If they can even access the medications, they may not have enough food to digest them. The wealth of the foundation is ultimately directed to the already wealthy pharmaceutical companies, while the residents of third-world countries remain malnourished and impoverished.

In the long run, symptom-focused strategies tend to benefit more those who promote them than those who are subject to them.

A better way to handle the threat of infectious disease would be to create the conditions for healthy, strong immune systems in children. As discussed in Part 4, these conditions include eating a diet based on whole foods (and breast milk in the case of infants); drinking clean water as the primary beverage; getting enough rest and enough exercise; and reducing stress. Like a muscle, the immune system must also be exercised in order to be strong. Natural “vaccination,” or technically, immunization, can occur when we are exposed from a young age to a wide variety of microbes in raw and fermented foods, in breast milk, even in dirt. It is a process not too different from that by which Benjamin Jesty’s dairy workers were naturally protected from smallpox. In fact, a healthy child’s natural exposure to milder infectious diseases, such as chickenpox, may be beneficial for healthy immune development. A well-nourished child with a well-exercised immune system is strongly equipped to handle the pathogens he or she is likely to encounter, and is likely to be one of the vast majority of children who do not suffer severe reactions to more serious diseases such as measles, or even polio or diphtheria. A child who, in contrast, is raised on a diet of largely processed food, has little exposure to beneficial bacteria, leads a sedentary lifestyle, and suffers from frequent colds and ear infections that are treated with antibiotics rather than fought off by the immune system, is more likely to have a severe reaction to a strong pathogen and, in the absence of a change in diet and lifestyle, would probably benefit from the protection of most vaccines, despite their potential side effects.

Unfortunately, the well-nourished child is far more rarely seen in our society than his or her conventionally nourished counterpart. The medical establishment has, for the most part, chosen neither to study nor to promote practices that make us healthier and consequently less dependent upon vaccines, medications, supplements, sanitizers, and surgery. Most doctors acknowledge that whole foods are better than processed foods, breast milk superior to infant formula, and some exercise better than none. Some may even recommend playing in the dirt over sitting inside all day. But junk foods continue to infiltrate our schools just as recess programs disappear from them. Infant formula is pushed on women whose babies are not growing at rates arbitrarily determined to be acceptable. Many people consider it inconvenient to try to be healthy or to allow the immune system to fight disease, which is the primary reason why a chickenpox vaccine, for example, was invented. As long as our attitude towards health is symptom-focused, and values short-term convenience over long-term wellness, our health will remain vulnerable and our need for medical interventions will grow.

In the late 19th century, during the worst epidemics of disease, the author and social theorist Leo Tolstoy wrote, in an essay entitled Modern Science:  “The defenders of present-day science seem to think that the curing of one child from diphtheria, among those Russian children of whom 50% (and even 80% in the Foundling Hospitals) die as a regular thing apart from diphtheria must convince anyone of the beneficence of science in general…our life is so arranged that from bad food, excessive and harmful work, bad dwellings and clothes, or from want, not children only, but a majority of people, die before they have lived half the years that should be theirs…And, in proof of the fruitfulness of science, we are told that it cures one in a thousand of the sick, who are sick only because science has neglected its proper business.” According to Tolstoy, though science was an incredibly valuable tool, when misdirected it did not benefit humanity. To be beneficial to us, medical science must be guided by wisdom and foresight, rather than shortsightedness, and should possess a healthy respect for and inquisitiveness into the capacities of the healthy human body.

The ultimate question of whether and how to vaccinate your child is a difficult one, and the answer is not the same for everyone. There is no utterly risk-free approach to take; even the healthiest person can still succumb to a powerful pathogen, as can the most thoroughly vaccinated person. Your decision must involve an awareness of your child’s likely susceptibility to each disease against which we vaccinate, and a weighing of the benefits of each vaccine against the risks of its possible side effects. Regardless of your decision, however, the best thing you can do for your children is to make the necessary changes in your diet and lifestyle for the promotion of health. For further reading on the risks and benefits of vaccines as well as strategies for strengthening the immune system, I suggest you consult one or more of the books listed below.

 

Vaccinations: A Thoughtful Parent’s Guide by Aviva Jill Romm (2001). Discusses vaccines from a historical perspective and contains natural and herbal remedies for common childhood diseases as well as recommendations for building immunity naturally. Romm is a certified professional midwife, a practicing herbalist, and a physician.

 

What Your Doctor May Not Tell You About Children’s Vaccinations by Stephanie Cave, M.D. (2001). Explores the possible relationships between vaccines and autoimmune diseases/developmental disorders, and contains an overview of vaccines and the legal issues related to them, as well as an alternative vaccine schedule.

 

The Vaccine Book by Robert Sears, M.D. (2007). Contains a detailed guide to the current vaccine schedule, including a discussion of the severity and rarity of each disease and the ingredients and side effects of each vaccine. Also contains an alternative vaccine schedule.

 

The Vaccine Guide by Randall Neustaedter, O.M.D. (1996, 2002). Provides an extensive and technical overview of research on the safety of vaccines and the results of that research.

 

How to Raise a Healthy Child in Spite of Your Doctor by Robert Mendelsohn, M.D. (1984). Covers the most common childhood ailments and the appropriate treatments for them. Also contains a section on diseases commonly vaccinated against, their severity, and the effectiveness of the vaccines.

 

Vaccination, an Overview, Part 4: Building Immunity

As discussed in Part 1, vaccines are designed to stimulate the immune system. In fact, their effectiveness really comes from triggering the body’s own natural processes of adaptive immunity. The underlying assumption of vaccination is that the immune system is inherently not likely to be strong enough to handle a disease when it encounters it in nature, which is why we need vaccines to safely and artificially engineer the encounter with disease. This assumption, which has its origin in the aforementioned germ theory of disease, is perhaps an understandable one. In the late 19th century, scientists had observed epidemic after epidemic of infectious disease resulting in millions of casualties. It was reasonable for them to conclude that the pathogens they had discovered were indiscriminately deadly. However, one scientist of the era, a French biologist named Antoine Bechamp, offered a different explanation for why people were succumbing to infectious disease at such great rates: he pinned it on their weak health.

Bechamp, a contemporary of and influence upon Pasteur, would have agreed with Pasteur’s arguments that methods of sanitization (such as hand-washing and pasteurization) would prevent the spread of disease by eliminating pathogens from the local environment. However, Bechamp’s theory was that most people who suffered from infectious disease did so because their own bodies were, in a sense, “unsanitized,” but on a cellular level. According to Bechamp, when we are in a diminished state of health, our cells and tissues form a breeding ground for microorganisms (or microzymas as he called them) that are largely already present in our bodies, but which do not take on a harmful form or reach harmful levels without the supportive environment provided by a sick individual. Bechamp’s theory formed a contrast to an interpretation of the germ theory that identified external pathogens as being the sole and direct cause of infectious disease, regardless of the prior health of the diseased person.

Meanwhile, Robert Koch, a contemporary of Bechamp and Pasteur, had formulated four postulates meant to establish a causal relationship between a unique pathogen and a unique disease. The postulates specified the following: (1) the pathogen should be present in all organisms suffering from the disease, but not in healthy individuals; (2) it must be possible to isolate the pathogen and grow it in a pure culture; (3) it must cause the disease when introduced into a healthy organism; and (4) it must then be possible to re-isolate it from the inoculated organism and show it to be identical to the original. Koch later had to change the first postulate after finding cases of healthy, asymptomatic organisms carrying the bacteria that cause cholera and typhoid fever. He also had to change the third postulate after finding that not all organisms exposed to a pathogen will display symptoms of infection.
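To see how demanding the original postulates were, here is a minimal sketch in Python that encodes each postulate as a boolean check. The Observation fields and the example data are hypothetical, invented purely for illustration; notice that a single healthy carrier is enough to break the first postulate, just as the cholera and typhoid carriers Koch encountered did.

```python
# A toy encoding of Koch's four postulates as boolean checks.
# The Observation fields and example data are hypothetical,
# invented only to illustrate the logic described above.
from dataclasses import dataclass

@dataclass
class Observation:
    has_disease: bool     # does this organism show the disease?
    pathogen_found: bool  # was the pathogen found in this organism?

def koch_postulates(observations, grew_in_pure_culture,
                    inoculation_caused_disease, reisolate_matches):
    # (1) Pathogen in all diseased organisms, absent from healthy ones.
    p1 = all(o.pathogen_found == o.has_disease for o in observations)
    # (2) The pathogen can be isolated and grown in pure culture.
    p2 = grew_in_pure_culture
    # (3) The cultured pathogen causes disease in a healthy organism.
    p3 = inoculation_caused_disease
    # (4) The re-isolated pathogen is identical to the original.
    p4 = reisolate_matches
    return p1 and p2 and p3 and p4

# An asymptomatic carrier (pathogen present, no disease) breaks
# postulate 1, which is why Koch had to weaken it.
observations = [
    Observation(has_disease=True, pathogen_found=True),
    Observation(has_disease=False, pathogen_found=True),  # healthy carrier
]
print(koch_postulates(observations, True, True, True))  # False
```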

Koch’s findings indicated that both Bechamp’s and Pasteur’s theories had some merit. Pasteur’s disease-centered approach, which relied on sterilization, pasteurization, quarantine, and sanitation, was focused on preventing the spread of disease by eliminating the pathogen from the external environment. Bechamp’s health-centered approach was based on making the individual stronger and healthier, and thereby better able to prevent pathogens from gaining a foothold within the environment of the human body. Although the specific mechanism of Bechamp’s theory – that of microzymas arising from our own tissues to form pathogens – has never been proven, scientists have since discovered that our health plays a tremendous role in the effective functioning of our immune systems, and consequently affects how easily we succumb to infections.

The human immune system is a conglomerate of many different body parts and processes. The skin, liver, kidneys, respiratory tract, intestinal flora and more all play a role in destroying pathogens by means of inflammation, white blood cells, and antibacterial or antiviral chemicals and enzymes.  Those pathogens are discharged via the cleansing and flushing action of tears, urine, mucus and diarrhea. The cells that form the adaptive part of the immune system are able to retain memories of specific pathogens and thereby easily neutralize those pathogens with antibodies upon future encounters.

All systems, mechanical or biological, cannot work properly unless supplied with the proper fuel or raw materials. A car cannot run without fuel, nor an ecosystem without water, oxygen, and sun. Our immune system is no different; in order to function, it must be provided with the nutrients it needs. Vitamin D, the antioxidant vitamins A, C, and E, vitamin B6, folic acid, and the minerals zinc, copper, iron, and selenium have all been found to be vital for promoting the health of the immune system, as have the beneficial bacteria contained in raw fermented foods. Other nutrients contained in whole foods that are as yet unstudied or even undiscovered may be similarly essential. Where infants are concerned, breast milk provides, in addition to needed nutrients, a variety of immunologic factors such as immunoglobulins (antibodies), the enzymes lysozyme and lactoferrin, and lymphocytes (white blood cells). These ingredients promote not only the health but also the growth and strengthening of the infant immune system, and protect against the routine infections that are much more commonly seen today in babies fed formula, which does not contain immunological factors. Along with nutrition, people need a certain amount of rest and sleep, as well as moderate exercise and clean water, in order to maintain healthy immune function. Stress, extreme conditions, exhaustion and dehydration all weaken the immune system, and they also weaken a nursing mother’s ability to provide nourishing milk.

The scientists who were formulating the germ theory of disease in the late 19th century were living during the tail-end of the Industrial Revolution, a period of enormous social, political and technological change. Economies in Europe and America had shifted from an emphasis on rural agriculture to one on urban industry.  In England in 1750, only 15% of people were living in urban areas, but by 1860, that number had risen to 80%.  Within the cities, the lower classes (both children and adults) had started working long hours in factories for little pay, often doing heavy labor in extreme conditions. As a consequence they were frequently exhausted and, as a result of their poverty, malnourished.  The upper classes, on the other hand, deliberately chose to eat newly available refined foods that were low in nutrients and high in calories, and many women did not get enough exercise or sunlight. They too were weak and sickly and prone to death in childbirth. In sum, the majority of people living during this era were in poor health, with low functioning immune systems, and thereby had reduced resistance to disease.

At the same time, the cities to which so many had relocated lacked the proper waste disposal systems for handling such large populations. Consequently, pathogens were able to contaminate the air, water, food and the streets. Technology had developed in such a way as to ease the transmission of infectious disease without yet providing a means to prevent it. Doctors themselves were some of the worst transmitters. As yet unaware of the need to wash their hands (in many cases outright rejecting the idea), they easily spread fatal pathogens to the many patients, particularly mothers in childbirth, whom they treated in busy urban hospitals. It is no surprise that infectious diseases ran rampant and that infant mortality was around 40% on average, with the highest rates occurring in the cities.

With the formulation of the germ theory of disease, sanitary practices such as those proposed by Pasteur and the physician Ignaz Semmelweis were grudgingly accepted by physicians, with positive results. However, the theory ultimately focused much more on the danger of microbes than on the ability of the healthy human body to resist them. As a result, scientists and government officials complemented sanitary practices by arguing the need for vaccines, rather than following Bechamp’s lead in promoting a healthy diet and lifestyle that would simply strengthen the immune system.

Fortunately, due to the explosion of nutrient deficiency diseases during the same period of time, vitamins were gradually discovered and added back into the processed foods from which they had recently been removed. This resulted in better nutrition, which, together with advances in sanitation technology, greatly improved overall health and hygiene in Europe and America following the turn of the century, though the conditions for epidemics were still occasionally created by destabilizing events such as World War I and the Great Depression. Smallpox, cholera, tuberculosis, diphtheria, scarlet fever, typhoid fever and other infectious diseases all began to decrease, whether vaccines had been developed for them or not. Endemic diseases like measles, mumps, rubella and chickenpox persisted, but were far less likely than before to cause complications or fatalities.

The only disease to cause epidemics in the developed world into the mid-20th century was polio. Polio, like many of the epidemic diseases of the time, had been around for thousands of years without ever being responsible for major epidemics prior to the late 19th century. Since 90% of polio infections cause no symptoms at all, deaths and paralysis from polio were rare. All that changed with the onset of the industrial revolution and the weakened health of the population; suddenly, polio could spread easily, and it met with little resistance in its victims. As sanitation began to improve, fewer people were exposed to the polio virus as young children, when they are least likely to suffer harm from it, and when they can acquire long-term immunity. But while the number of people exposed to polio was lessened, the number of those who died or suffered paralysis increased, since those who did not develop immunity as children encountered the virus as teenagers or adults, when the disease is more severe.

Additional factors in the severity of the polio epidemic were the rising fads of formula feeding and the tonsillectomy procedure. By 1950 over half of all babies were being fed infant formula (lacking polio antibodies, naturally), which was being promoted as better than breast milk by now-discredited scientific studies. It was around this time as well that performing tonsillectomies became a fad among surgeons and doctors; in the 1930s and 1940s between 1.5 and 2 million tonsillectomies were being performed each year. The tonsils are glands that aid the immune system by blocking pathogens; when they become inflamed, it is a sign that they are hard at work. These tissues formed the first line of defense against ingested or inhaled pathogens, such as polio, and removing them removed that defense. Since polio was not as stymied by better sanitation as other diseases, it was able to take advantage of the weakened immune systems of older children and adults. Still, polio, like most other infectious diseases, continued its downward trend of incidence prior to the introduction of its vaccine.

As the historical evidence indicates, vaccines simply accelerated an already-occurring disappearance of infectious diseases in developed countries. Without advances in nutrition to ensure basic immune system function, and in sanitation to prevent the spread of pathogens, infectious disease would probably have persisted despite vaccination. Tuberculosis is a good example. During parts of the 19th century it was responsible for one quarter of all deaths in Europe. It no longer troubles the developed world, despite the fact that we never effectively vaccinated against it in America or Europe. However, it still causes between 1.5 and 2 million deaths per year in impoverished countries whose citizens have poor health and sanitation, despite widespread vaccination in those countries.

In conclusion, while vaccines do possess varying degrees of effectiveness, and can help to reduce the incidence of disease, they are not our most important form of protection against disease. As Bechamp theorized, the explosion of infectious disease in the 19th century was really due to a relatively brief, but steep, reduction in general health, which, when paired with unsanitary living conditions, made epidemics inevitable. Our strategy for acquiring better immunity to all diseases, or providing the conditions for such immunity to our children, should primarily be to maintain good nutrition and health through breastfeeding, consumption of natural whole foods, clean water, regular rest, regular exercise, and reduction of stress.

 

In next week’s article, entitled “Two Approaches to Vaccination,” I’ll discuss the underlying worldview behind the modern-day vaccine schedule and contrast it with a more holistic approach to public health.

Vaccination, an Overview, Part 3: The New Epidemic

In America today, the infectious diseases that remain, such as the flu, are far less life-threatening, and infant mortality has drastically decreased from just a century ago. Children of today are highly likely to make it to adulthood. Coinciding with the reduction of infectious disease, however, has been a corresponding emergence of an entirely new kind of health problem in children: chronic disease. Children in ever greater numbers are suffering from immune system disorders and developmental delays which have no known cause or cure. Eczema, hives, hay fever and food sensitivities have been increasing since the 1920s, with rapid surges occurring in the 1960s and the 1980s, and these allergies now affect tens of millions. Asthma has been increasing since the 1960s, particularly in developed countries, and it now affects 6 million children in the U.S. Attention-Deficit Hyperactivity Disorder has tripled in incidence since the 1970s. Autism spectrum disorders have grown from 1 in 2,000 in the 1960s and 1970s to 1 in 150 today, with the greatest spike occurring from 1996 to 2007. All of these increases in incidence are too great to be explained solely by genetic mutations (although genetic susceptibility does seem to be a factor) or by evolving diagnostic methods and definitions. Consequently, an external, environmental agent (or agents) must be triggering them. Since these diseases are chronic, seem to be unassociated with any pathogen, and are not infectious, they cannot be explained by the germ theory of disease, and scientists possess no alternative theory that would explain what in our environment could be triggering these types of health problems.
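For a rough sense of scale, here is a back-of-the-envelope calculation, using only the autism prevalence figures quoted above, of how large the reported increase is:

```python
# Back-of-the-envelope comparison of the autism prevalence
# figures quoted above; no other data is assumed.
earlier = 1 / 2000  # roughly 1 in 2,000 (1960s-1970s)
current = 1 / 150   # roughly 1 in 150 (today)
print(f"Reported prevalence is about {current / earlier:.0f}x higher")
# Reported prevalence is about 13x higher
```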

It is worth noting that our environment has changed drastically over the last half-century. Our food, water and air are less likely to be contaminated by bacteria like tuberculosis or cholera but are more likely to contain pesticides and other potentially toxic chemicals. Children who used to run and play outdoors, using up their excess energy and exposing their immune systems to many different natural substances, from pollen to poison ivy, now spend most of their time indoors in school or sitting still in front of a screen at home. At the same time they have adopted diets high in excess calories and low in nutrients. Antibiotics and pasteurization have reduced the presence of both bad and good bacteria in their lives. This new lifestyle could be the culprit for children’s hypersensitive immune systems and hyperactive behavior, or it could at least be a contributor. When it comes to autism spectrum disorders, however, many parents believe that vaccines play a major role.

Vaccines have never been completely without side effects, and even the safest vaccines will cause temporary side effects (such as pain and swelling, fever, vomiting, diarrhea, rashes, headaches and crying) between 5% and 40% of the time.  Serious side effects are usually some form of inflammation: Guillain-Barre syndrome  (an autoimmune disorder causing paralysis) and encephalitis (inflammation of the brain).  However, these are said to be extremely rare. A vaccine for which the serious side effects were found to be relatively more common was the first combination vaccine, DTP (diphtheria, tetanus and pertussis), which was released on the market in 1946. In the 1970s and 1980s there was a growing awareness that the pertussis portion of the vaccine, which used a whole-cell B. pertussis bacteria, was responsible for a higher-than-expected rate of reactions such as convulsions, shock, cardiac distress and brain damage. In 1981 Japanese scientists developed a new vaccine that used a safer acellular pertussis component, and caused far fewer reactions, but this form of the vaccine was only adopted in the United States in 1996, after many years of lobbying by parents who had observed their children react adversely to the DTP vaccine.

As was the case with the DTP vaccine, suspicions of a link between autism and vaccines have their initial basis in the case reports of parents who see their children lose previously acquired mental and social skills following doses of vaccines, the majority of which are administered in the first two years of life, the same timespan in which autism usually appears. This correlation could be explained as a coincidence, but the issue is complicated by the fact that rates of autism have increased in conjunction with the rising number of shots given to children. In 1983, for example, children received vaccines for diphtheria, tetanus, pertussis (given together as DTP), polio, and mumps, measles and rubella (given together as MMR). This schedule represented vaccines for 7 diseases in the first 4 years. There were 6 shots containing 18 doses of vaccines plus an additional 4 doses of the oral polio vaccine, totaling 22 doses of vaccines. In 1995 the schedule was largely the same, except for the addition of the vaccine against Haemophilus influenzae type B (HIB), a bacterium that causes meningitis. After that, however, the number of vaccines began to increase. By 2007, children following the standard schedule were receiving 40 total doses of vaccines against 14 diseases, double what had been given a decade previously. At the same time the number of shots did not greatly increase, because new combination vaccines became available that combined four or five vaccines into one shot. The result has been a significant increase in the amount of foreign material injected into a child’s body at one time.
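To restate that arithmetic in one place, the quoted figures for the two schedules can be tabulated and compared. This is a minimal sketch using only the numbers given in the paragraph above; it is a simplified restatement, not an official immunization schedule:

```python
# Dose totals for the two schedules described above. Figures are
# taken directly from the text; this is a simplified restatement
# of the arithmetic, not an official immunization schedule.
schedules = {
    1983: {"diseases": 7,  "injected_doses": 18, "oral_doses": 4},
    2007: {"diseases": 14, "injected_doses": 40, "oral_doses": 0},
}
for year, s in sorted(schedules.items()):
    total = s["injected_doses"] + s["oral_doses"]
    print(f"{year}: {total} doses against {s['diseases']} diseases")
# 1983: 22 doses against 7 diseases
# 2007: 40 doses against 14 diseases
```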

As discussed in last week’s newsletter, the ingredients of a vaccine must be carefully balanced and formulated in order for the vaccine to be both safe and effective. The typical vaccine components mentioned in the first section – the pathogen, the tissues in which it is cultured, an adjuvant to help stimulate immunity, and a preservative to protect the vaccine from additional pathogens – are each capable of causing unwanted side effects. Live viruses and bacteria, found in the MMR vaccine among others, are better able to stimulate immunity, but are more likely than weakened or killed pathogens to cause a persistent infection and excessive inflammation, including inflammation of the brain (encephalitis) and subsequent brain damage. Animal or human tissues in which pathogens are cultured contain proteins similar to those contained in our own tissues. In reacting to the pathogen in a vaccine, some immune systems may see these proteins as part of the threat, and produce autoantibodies against them. These autoantibodies can’t tell the difference between the injected proteins and the body’s own proteins, resulting in chronic inflammatory autoimmune diseases such as Guillain-Barre syndrome, arthritis or multiple sclerosis. The most typical adjuvant in vaccines, aluminum, is a metal that has been linked to Alzheimer’s disease, dementia and brain damage, and it may be difficult for some children to detoxify. As for preservatives, some vaccines contain formaldehyde, a carcinogen, and most vaccines previously contained thiomersal, a form of mercury, before vaccine manufacturers agreed to provide mercury-free vaccines upon request several years ago. Could these ingredients, as they are injected into children with greater frequency and in greater quantities, be responsible for the increasing incidence of chronic immune hypersensitivity and developmental disorders in children? Clearly, not all children have negative long-term reactions to vaccines; in fact, it seems that most of them don’t. But might some children have a genetic susceptibility to having adverse reactions to vaccines, particularly when administered according to the current schedule?

What are the facts of the situation? First, vaccines carry the potential for adverse effects, including brain damage. Second, there is a parallel between increasing autism rates and the increased number of vaccines given. Last, autism typically emerges in children during the period of time when vaccines are administered. What have we proved? Nothing. These facts are not proof of a causal relationship between vaccines and autism – they only show a correlation. However, this correlation makes a causal relationship a possibility worth investigating, especially since no other cause of autism has been identified. Accordingly, many scientific studies have been done on whether a link between vaccines and autism exists. The initial safety studies done on each new vaccine by Merck, Sanofi Pasteur, Wyeth, and GlaxoSmithKline (the four large pharmaceutical companies that manufacture almost all vaccines), the results of which are reviewed by the FDA and supplemented by post-market reports to the CDC’s Vaccine Adverse Events Reporting System (VAERS), have not found a link for any individual vaccine. Doctors and research scientists, most notably at the independent, non-profit Institute of Medicine, have conducted many additional studies over the past two decades, as well as comprehensive reviews of earlier research, and the vast majority of them have also concluded that no link can be proven, thus confirming the scientific consensus that the serious side effects of vaccines are extremely rare and do not include autism.

The most famous study that did hint at a possible connection between vaccines and autism was published in 1998 in The Lancet, a British medical journal that is perhaps the most respected in the world. The lead author, Dr. Andrew Wakefield, and twelve of his colleagues, argued, based on observations of twelve children with both inflammatory bowel disease and autism, that the children might have a new syndrome caused by the vaccine-strain measles virus, which was found in their intestines. Because the children were previously normal, Dr. Wakefield suggested an environmental trigger might be the cause of the syndrome, and called for the MMR vaccine (measles-mumps-rubella combination) to be discontinued in favor of separate vaccines administered at separate times, until more research could be done. However, the British government felt that to do so would increase the exposure of children to the three diseases. The results of the study were widely reported in the news media, and with MMR remaining the only vaccine available, many parents did not vaccinate their children against the diseases at all.

In the years that followed, both Wakefield and the study received increasing criticism. Other scientists did similar studies and reported that they had failed to duplicate the results. A journalist investigating Wakefield found that he had ties to a lawyer preparing a lawsuit against the MMR manufacturers, and that he had a patent on a new measles vaccine, both indicative of serious conflicts of interest. Ten of the twelve co-authors eventually disowned the paper. Earlier this year, The Lancet itself finally retracted the paper, and Dr. Wakefield lost his license to practice medicine in the UK.

In light of this evidence, it would seem that the possibility of any link between vaccines and autism has been thoroughly eliminated. But for a variety of reasons, we must question the credibility of those who signed off on vaccine safety, who authored and reviewed pro-vaccine studies, and who have promoted vaccines in the media. To begin with, the general public has long had good reason to distrust the ethics and integrity of the pharmaceutical industry, which has been known to disguise or minimize knowledge of adverse reactions to its products (such as Avandia, Vioxx and Fen-Phen). It has also been known to aggressively market its products to as wide a customer base as possible – even urging, in recent months and with governmental approval, cholesterol-lowering drugs on people who do not have high cholesterol. Vaccines are a guaranteed lucrative investment, given that they are prescribed equally to almost every individual in the country.

An additional strike against the pharmaceutical companies’ assurances of safety is that they are not held responsible for adverse side effects of the vaccines they manufacture. In the 1980s, as more parents whose children had been injured by the DTP vaccine began to bring lawsuits against vaccine manufacturers, those manufacturers threatened to stop making vaccines entirely, reasoning that it would be unprofitable to continue if they had to pay expensive personal injury claims. In order to ensure that vaccines remained available to the public, the U.S. government stepped in and passed the National Childhood Vaccine Injury Act, which set up a special government court for hearing vaccine injury claims and awarding damages up to $250,000. The damages are funded by proceeds from a tax on vaccines, thus shielding vaccine manufacturers from any financial liability. Claims are argued before a government-appointed judge rather than a jury, and while most claims are rejected, the court has had to award almost $2 billion in damages since its inception.

Clearly, pharmaceutical companies manufacture vaccines for profit, not out of an overriding concern for the safety of children. It is not likely that they would abandon profitable products such as vaccines even if they knew that such products caused relatively frequent and severe side effects–just as they knew, but kept secret, the fact that Avandia increased the risk of heart attacks, for example. It is therefore prudent not to accept at face value claims (and by claims, I mean advertising) by the vaccine manufacturers, and by the scientists whom they employ, that vaccines are extremely safe.

What about the government’s independent oversight and regulatory authority? Unfortunately, as in so many industries (including banking, energy, and health care) a revolving door of employment exists between the pharmaceutical companies and the federal authorities that regulate them. An example is Dr. Julie Gerberding, who directed the CDC from 2002 to 2009, a period during which the number of vaccines administered and the number of autism cases greatly increased. Dr. Gerberding waited exactly one year and one day after leaving the CDC – the legal minimum – before taking on the job of President of the Vaccine Division of Merck Pharmaceuticals. During her CDC tenure, Gerberding heavily promoted Merck’s new-to-the-market HPV vaccine, Gardasil, as well as the safety and effectiveness of vaccines in general.

As for the scientists and medical doctors who conduct research on the safety of vaccines, many rely on the financial support of the pharmaceutical companies to carry out their research. Without that support, they would be unable to carry out wide-ranging, long-lasting epidemiological studies of vaccine reactions. The most vocal and media-friendly proponent of vaccine safety, Dr. Paul Offit of the Children’s Hospital of Philadelphia, happens to be the co-inventor of the rotavirus vaccine RotaTeq (also manufactured by Merck). The sale of royalty rights to RotaTeq reportedly totaled $182 million, a share of which went to Offit.

The conflicts of interest described so far have their origin in greed, but some conflicts can arise from humanitarian motivations. Most public health officials worry that if doubts about vaccine safety are given a more thorough hearing, many parents might choose to vaccinate their children less, or not at all (as we saw happen in the aftermath of the Wakefield study publication), and consequently return us to an era of epidemic disease rivaling that of the 19th and early 20th centuries. To serve the greater good, the authorities may therefore be unwilling to give a fair hearing to the possibility of a vaccine-autism link. It’s possible that, even if Dr. Wakefield was partly right in his conclusions, the government and scientific community may have been driven by these types of fears to dissect his work for errors and conflicts and to magnify those flaws.

With so many powerful institutions – pharmaceutical companies, government, and scientific bodies – motivated for a variety of reasons to disprove a link between vaccines and autism, it is unlikely that any individual scientist or pediatrician is willing to stake his or her reputation, and potentially even a license to practice medicine, on publishing (or even conducting) a study indicating greater-than-reported side effects of vaccines. Not only would funding for such a study be difficult to obtain, but any flaws in its methodology would be far more heavily scrutinized than if the study were to confirm what has already been promoted as scientific truth.

If so many conflicts of interest are at work, shouldn’t we expect to see weaknesses in the pro-vaccine studies? In fact, on closer examination, many of the studies showing that vaccines are unrelated to autism have significant methodological flaws or are reported as having broader conclusions than they really do. To take a recent example, an epidemiological study by researchers from the University of Louisville School of Medicine was published in the journal Pediatrics on May 24th of this year, stating that giving children vaccines on schedule had no negative effect on long-term neurodevelopment. Most news outlets reported that the study had shattered the “myth” that a delayed or alternative vaccine schedule was safer than the standard, CDC-recommended schedule. However, the study was based on data from a 2007 study published in the New England Journal of Medicine intended to determine whether increased amounts of thiomersal in vaccines caused greater numbers of neuropsychological disorders. That study contained a disclaimer noting that children with autism spectrum disorders were specifically excluded from the data set. Consequently, such children were not examined in the recent study either, and the authors acknowledged that they were restricted in their ability to assess outcomes such as neurodevelopmental delay, autism, and autoimmune disorders. The differences between the two groups that were compared were also not large: those placed in the “timely” group received the recommended 10 vaccines in their first seven months, while the “untimely” group received an average of 8. The untimely group, though their shots were delayed, did not actually receive fewer vaccines at each doctor visit, and the study indicates that they may have missed vaccines for socioeconomic reasons rather than intentionally abiding by a different schedule. Finally, the study covered only children receiving shots from 1993 to 1997, the period just prior to that in which the number of vaccine shots increased dramatically.

As stated above, these types of omissions and flaws are characteristic of most of the pro-vaccine studies. But the fact that Dr. Wakefield’s study has been discredited as well is not necessarily comforting for those wanting to be reassured about the safety of vaccines, as it indicates that his conflicts of interest, as well as an error-filled study, somehow escaped the notice both of the editors of The Lancet and of the dozen co-authors who participated in the research. It must be concluded that we cannot simply take for granted the results of scientific studies from even the best medical journals, having seen what happens when they are subjected to intense scrutiny. And, above all, we must keep in mind that such scrutiny is not likely to be applied to studies that confirm the scientific consensus on vaccines.

To better determine whether a connection might exist between vaccines and autism, we would need a long-term study comparing the health problems of a control group of completely unvaccinated infants against another group that follows the standard vaccine schedule, and possibly additional groups that follow selective or alternative vaccine schedules. No study of this type has yet been done. Pro-vaccine groups argue that such a study would be unethical, assuming ahead of time that vaccines are safer than the alternative, though this is exactly what the study would be meant to determine. Though such a study would be expensive, anti-vaccine groups might be able to fund it, were it not for the fact that, having staked their reputations on a link between vaccines and autism, they could not be considered an objective sponsor. Perhaps the main obstacle, however, is that a study of this type would require a large number of children to go unvaccinated, and thus potentially susceptible to disease, and no public or private institution would want to take responsibility and liability for those potential adverse effects. Of course, autism is itself an epidemic that must be addressed, but as long as its cause remains unknown, no institution is officially liable for it. Only the families of autistic children bear the burden for it.

As the controversy rages on, fewer parents are taking the medical establishment (including the CDC) at its word. On May 5th, 2010, the CDC announced the results of a study it had conducted on parental compliance with the current recommended vaccine schedule. The percentage of parents who refused or delayed at least one vaccine for their children had increased from 22% in 2003 to 39% in 2008. Why? The parents cited concerns about the safety of vaccines, particularly the risk of autism. If the risks of vaccines are in fact much greater than reported, these parents seem to be making the right choice. However, one must not forget the reason we vaccinate in the first place: to protect our children from infectious diseases. Eliminating one of the possible causes of autism from your child’s life won’t do them any good if they suffer permanent damage or death from polio, measles, diphtheria, tetanus or meningitis. Suspecting that the side effects of vaccines may be greater than reported therefore leaves us with no easy decision. The overarching question that remains is the same one that has followed us throughout human history: how do we safely protect our children from disease?


We’ll take a stab at answering that question in next week’s newsletter, “Building Immunity.”


Vaccination: An Overview (Parts 1 and 2)

1. How Vaccines Work


We live in a world permeated by microorganisms of all kinds – bacteria, fungi, even microscopic animals and plants. Microorganisms interact with human beings in a number of different ways, in many cases seeking us out as their hosts for mutual benefit. Probiotics, for example, are various species of bacteria that live in our intestines, helping us digest our food and absorb nutrients. But some viral and bacterial microorganisms, known as pathogens or germs, cause disease and death in their human hosts rather than coexisting in a mutually beneficial relationship. Vaccination is meant to be a way of protecting us from these pathogens.

Generally speaking, a vaccine is a biological solution, prepared in a laboratory, that contains a weakened or killed virus or bacterium. A person who receives a dose of a vaccine containing a microorganism becomes immune to the disease caused by that microorganism. For example, the measles vaccine grants immunity to the measles virus and thereby to the disease the virus causes. The vaccine accomplishes this by taking advantage of the amazing immune system that exists in the human body.

The immune system is a network of biological processes that combine to protect us from infectious agents such as the pathogens mentioned above. Components of the immune system include physical barriers like skin and mucus but also interior protective agents such as white blood cells and interferons (proteins that protect us from viruses). Our most complex and advanced form of immunity, known as adaptive immunity, involves antibodies (aka immunoglobulins). Antibodies are specific proteins that the immune system produces upon encountering a foreign substance such as a microbe (aka an antigen). An antibody enables the body to more quickly recognize and neutralize the antigen to which it corresponds. As a result, after just one encounter with a pathogen, we can become permanently immune to it upon any future encounters. In other words, due to our ability to produce antibodies, we are able to adapt to an attack such that the same attack won’t work on us twice.

When we are injected with a dose of a vaccine containing a weakened or killed virus or bacterium, the immune system kicks into gear and fights off the pathogen, at the same time producing antibodies against it. Ideally, the pathogen will be weak enough to pose no danger to the body, but strong enough to still stimulate antibody formation. That way, if we encounter the pathogen in the future, we’ll have the antibodies ready to fight it off regardless of its strength. In other words, we’ll be immune to it.

Most vaccines contain, in addition to the pathogen, the following ingredients: animal or human tissue, which serves as a medium in which the pathogen can be cultured; a preservative (such as thiomersal, a mercury-containing compound, or formaldehyde) to keep other pathogens from contaminating the vaccine; a stabilizer, such as MSG, to prevent the vaccine from being damaged by heat, light, acidity or humidity; and an “adjuvant,” usually an aluminum compound, which increases the response of the immune system. These ingredients, which differ from vaccine to vaccine, are the result of many decades of research on how to make vaccines safe, effective, and inexpensive to produce.

The most crucial balance to strike in making a vaccine is between a too-strong pathogen and a too-weak one. In the former case, the pathogen may overwhelm the recipient’s immune system, resulting in disease; in the latter, it may not stimulate lasting immunity. For example, the oral polio vaccine, which used a live polio virus administered by mouth, much as the actual virus is contracted, caused polio and subsequent paralysis in a small number of children each year before its use was discontinued in the United States in the early 2000s. For this reason many vaccines are injected, entering the body via the bloodstream, and contain weakened or killed pathogens, relying partly on the aforementioned adjuvants for additional stimulation of the immune system. However, this method, presumably because it bypasses certain aspects of the immune system, sometimes does not result in lasting antibody production, in which case it does not confer permanent, lifelong immunity (hence the need for recurrent “booster shots” of certain vaccines). By contrast, immunity from a naturally contracted infection is more likely to be permanent, but the risk of serious disease is much greater when acquiring immunity this way. This dilemma of safety versus effectiveness, of stimulating immunity without harming the patient, has been present since the earliest and most rudimentary attempts at vaccination.


2. The History of Vaccination


Observing the progress of the Plague of Athens in 430 BC, the Greek historian Thucydides wrote that the plague (now thought to be typhus) “never took any man the second time so as to be mortal.” Those who got sick but survived did not have to fear dying from the disease later on. Similar observations of adaptive immunity may have been what led seventh-century Buddhist monks to adopt the practice of drinking small amounts of snake venom to make themselves immune to the poison from an actual bite. In ancient China, the most threatening disease was smallpox, and by the 10th century one Buddhist nun had found a method for preventing smallpox through inoculation. Inoculation, a more general term than vaccination, is the placement of something into a medium in which it can grow and reproduce, whether a plant part grafted onto another plant or an antigen introduced into a human body. Inoculation with smallpox for immunization purposes is known as variolation. Over the next few centuries, variolation became common practice in China as a means of providing some protection against smallpox.

Ancient Chinese methods of variolation generally consisted of drying and pulverizing scabs from people with mild cases of smallpox and blowing the scab powder into the nostrils of healthy people. Mild cases were chosen for the same reason that vaccine makers now often use weakened or killed pathogens: to reduce the risk of inducing a serious infection. Another form of variolation was to have healthy children wear the undergarments of infected children for several days – a tactic similar to the chickenpox playdates of the 20th century, prior to the invention of the chickenpox vaccine.

Similar forms of variolation were eventually practiced in India, Byzantium and the Middle East. Through the Crusades, the slave trade, and ordinary commerce, smallpox spread to Europe and the Americas, and variolation followed. Variolation techniques now included applying smallpox scab powder to cuts or scratches on the skin, and the process was slowly accepted in the West as a preventative against the disease, though many distrusted it because of its Eastern origin. The major drawback of variolation, however, was that people occasionally developed serious cases of smallpox from the procedure, and either died or suffered scarring and blindness. People sometimes feared the preventative almost as much as the disease itself.

In the eighteenth century, smallpox was widespread throughout England, but one group of people was curiously unaffected by the disease: dairy workers. Through their contact with cows, dairy workers typically became infected with cowpox, a disease similar to smallpox but much less dangerous, spread by touch from the infected udders of cows to humans. Cowpox was similar enough to smallpox that the antibodies produced by infected workers could fight off smallpox microbes as well as cowpox microbes. One of the first people to take advantage of this phenomenon to deliberately induce immunity was an English dairy farmer, Benjamin Jesty. In 1774, during a local smallpox epidemic, Jesty infected his family with the cowpox virus that had already infected his servants and workers. The family easily recovered from cowpox and were untouched by smallpox.

Other farmers carried out similar experiments with success. Eventually, word of this immunization method reached the surgeon and scientist Edward Jenner, who in 1796 decided to test it out by inoculating his gardener’s eight-year-old son with pus from a milkmaid’s cowpox blisters, and then deliberately injecting him with smallpox (scientists had a little more leeway to experiment freely back then). Since the smallpox virus did not appear to affect the boy, Jenner announced that he had been successfully “vaccinated,” deriving the term from vacca, Latin for “cow.” Jenner continued to test vaccination on dozens of additional subjects with immediate success, and thanks to his connections in scientific and government circles, was able to widely publicize his findings. He also founded an institution to promote his method, and the British government soon banned variolation in favor of vaccination.

Over the course of the 19th century, vaccination against smallpox became standard practice in most European countries, and was in some cases mandatory. However, smallpox epidemics continued, particularly during times of stress and upheaval. During the Franco-Prussian War of 1870-71, a smallpox epidemic struck France and Germany and killed over 100,000 people. Jenner himself had become aware that both the safety and the effectiveness of the smallpox vaccine were less than ideal. He discovered that a significant number of people still developed smallpox even after vaccination, and that recipients sometimes became infected with other diseases that had contaminated the vaccine. As for the immunity conferred by vaccination, it generally lasted only 3-5 years before beginning to decline.

What Jenner did not know was the nature of smallpox and how it was transmitted. Only by the end of the 19th century did scientists investigating both smallpox and the many other infectious diseases prevalent at the time (tuberculosis, diphtheria, cholera and typhus, among others) arrive at the famous germ theory of disease, which stated that each infectious disease is caused by a particular microscopic living organism. The noted French chemist Louis Pasteur was a major contributor to the theory, having proven that microscopic organisms, good and bad, do not generate spontaneously but reproduce by subsisting on nutrients, and can be airborne or anaerobic. Pasteur subsequently put his discoveries to use in developing pasteurization, the method of heating liquids to kill most of the microorganisms present within them.


The germ theory of disease enabled scientists to more easily develop vaccines against infectious diseases besides smallpox. Pasteur himself worked on vaccines against rabies and anthrax. Aided by his expertise in microbiology, he discovered methods for attenuating (weakening) bacteria in vaccines so that the vaccines could confer immunity with less risk of actually causing disease.  In the following decades, scientists further refined and improved the techniques of vaccine development, introducing vaccines for diphtheria, tetanus, and whooping cough prior to World War II. A polio vaccine was developed during the early 1950s. Since then, vaccines have been developed for many other infectious diseases: measles, mumps, rubella, hepatitis A and B, meningitis, chickenpox, flu and most recently HPV and rotavirus. Today, each disease against which we routinely vaccinate has a small or nonexistent incidence in the developed world. If the 19th century was the Age of Infectious Disease, the 20th century was the Age of the Vaccine.


Health Food Store Shopping List

In the old days, health food stores were small, grungy, lovable, hole-in-the-wall establishments that carried a few basics for health food nuts: organic carrots, tofu, brown rice, sea vegetables, carob chips, etc. As healthy eating became more popular, these stores multiplied to the point where almost every major town in America had a local health food store. With that multiplication came expansion: in addition to rice, beans and greens, you could also acquire healthier versions of the chips, crackers and cookies carried by conventional supermarkets.

In recent years, the health food store market has been cornered by Whole Foods, a mega-chain that drove many smaller stores out of business. While Whole Foods has made health food more accessible for many people, it may have missed the point of the original health food store. At many Whole Foods stores, it has become almost impossible to find bulk brown rice or macrobiotic foods among the countless aisles of organic soda and breakfast cereal made with cane sugar instead of corn syrup. While those processed foods are better than their counterparts at the local Walmart, they demonstrate that just because a food is sold in a health food store doesn’t mean it’s healthy. What follows is my slimmed-down guide to the essential foods you need from your local health food store:

Fruit. The best tasting and most nutritious fruit is fresh, local and organic, qualities which you can usually only find in a health food store or at a farmer’s market. Make local and in season your priority, followed by organic. Fruit can be expensive, but it’s worth the cost. If you need to, shop at a conventional supermarket for fresh fruit rather than go without entirely.

Vegetables. Health food stores carry a wide variety of fresh, local and organic vegetables. When purchasing vegetables, try to incorporate a variety of different vegetable groups, which include greens (such as kale or collards), roots (like carrots and beets), bulbs (onions, garlic), stalks (celery), gourds (squashes), and nightshades (tomatoes, potatoes, eggplant, etc). Vegetables should generally be stored in the crisper at the bottom of your refrigerator. Wrap greens in a paper towel or two to soak up the water on them, then keep them in a plastic bag with the air pressed out; they will keep until they turn yellow (about a week or two). Other vegetables will stay good for several weeks, but it’s generally best to use them quickly, as they lose nutrition over time.

Grains and Beans. Whole grains and beans are kept in bulk bins, and are very inexpensive when purchased this way. You can get much of your calories and protein from these foods. They take some time to cook (especially beans, which need to be soaked and then boiled), but they last a long time, and once cooked they keep in the fridge for several days. Uncooked dry beans and grains keep for a year or more.

Nuts and Dried Fruit. These are good choices for snack foods, but they are not meal replacements. Only snack between meals if you’re still hungry despite having had a solid breakfast, lunch and dinner. These foods can also be expensive. Don’t try to cut back on fresh fruit, vegetables, or all-natural animal products so that you can buy snack foods.

Dairy Products. Raw dairy products from healthy cows are best, but in most states these cannot be sold in stores. Cheese is an exception—choose unpasteurized cheese when it is available. If you don’t have a source of raw milk, grass-fed organic pasteurized milk is the next best thing (non-homogenized is good too). Butter is best when cultured and unsalted. Yogurt should have no added sugar; add your own natural sweetener (such as raw honey) instead.

Fish. Fish is very good for you if it is wild-caught rather than farm-raised. Wild-caught fish is more expensive, so you may only be able to have it occasionally, which is okay, especially since mercury in many species of fish is a concern. Sardines are a low-mercury, less expensive option.

Poultry, Pork, Beef, Eggs. Meat and eggs can be an important part of your diet and a good source of protein and fat, provided they come from healthy animals. For poultry, choose organic and free-range (or at least free-range), raised without hormones or antibiotics. Pork should be organic if possible. Beef should be organic but, more importantly, grass-fed. Eggs should come from organic, free-range chickens. Be sure to check out local options.

Herbs and Spices, Salt, Natural Sweeteners. All of these condiments should be staples in your kitchen. Start building up a collection of herbs and spices and natural sweeteners (especially raw honey), and use sea salt instead of regular salt. Buying herbs and spices in bulk is more cost-effective, and you can buy smaller amounts of the ones you use less often. Most health food stores have a bulk spice section separate from the bulk grains and beans.

Macrobiotic Foods. The original standbys of the health food store, these foods can now be difficult to find. However, they are usually grouped together, and include tamari (a natural form of soy sauce), brown rice vinegar, umeboshi plum paste, tekka, gomashio, and sea vegetables (nori, kombu, wakame, arame, hijiki). In the cold section you can also usually find the macrobiotic foods miso, tofu, tempeh, seitan, and mochi. These foods originate in the traditional Japanese diet and are all very nutritious and beneficial to health.

Oils, Vinegars, Sauces, Nut Butters, Pastas, Pickles, Etc. Not all the foods you buy need to be whole foods. It’s not practical to press your own olive oil or grind your own peanuts for nut butter, for example, and you wouldn’t necessarily end up with a better quality product. In this sense some pre-made foods are perfectly fine, as long as the ingredients themselves are whole foods. Peanut butter that contains peanuts and salt—fine. Peanut butter containing peanuts, salt, and hydrogenated vegetable oil—not so good. Pasta made from whole wheat is much better than pasta made from white flour. Olive oil should be unrefined and unfiltered. Generally, the fewer the ingredients—and the more whole-food the ingredients—the better.

Bread. People are always confused about what bread to buy, but the answer is fairly simple: choose bread made from 100% whole grain flour (i.e. whole wheat, whole rye, etc.). If the label says simply “wheat flour” or “whole wheat flour and white flour,” skip it. Because it is so nutritious, 100% whole grain bread molds quickly, so keep it in the refrigerator or freezer. In fact, many whole grain breads are kept in the freezer section of the health food store. Don’t be afraid to try something new!

Supplements. If you eat the foods listed above, you really don’t need supplements. You may occasionally benefit from certain herbs, if you happen to be sick. But the most important thing is to be eating a good diet so you don’t get sick in the first place. When it comes to injuries such as bruises, cuts, stings/bug bites, and burns, the supplement section has some effective remedies such as arnica, calendula, stingstop, and aloe vera.

If any of the foods listed above are new to you, or you have no idea how to incorporate them into your diet, be sure to contact me with your questions!

Modern Day Malnutrition: Anemia

In a country as wealthy as the United States, with food so abundant and affordable, it seems strange that anyone could suffer from malnutrition. And yet, not only is malnutrition a common occurrence, but even the most well-off of our citizens are susceptible to it. The same goes for other developed nations. But it’s not happening because we’re not getting enough food. Developed countries rarely, if ever, have famines and food shortages. Rather, it’s the nature of our food that is causing this problem. Thanks to modern food processing methods, developed countries produce a plentiful supply of food that is high in calories—sugar, white flour, corn syrup, and animal products from animals fattened up on soybeans and corn. While in centuries past, many people died for want of calories, we have more than we could ever eat, and at an affordable price. Unfortunately, those same modern processing methods, though they give us cheap calories, eliminate much of the nutrition from foods. Nutrients are just as important for survival as calories, so with too much of the latter and not enough of the former, it’s easy to end up both overweight and undernourished. You can be eating too much and not enough at the same time! It doesn’t help that, thanks to the structure of our society, high-calorie/low-nutrient foods are the cheapest and the most convenient.

Anemia is a good example of the malnutrition that runs rampant despite the prosperity of our country. Anemia is a blood disorder with symptoms including fatigue, pallor, depression, headaches, lower back pain, dizziness, easy bruising and slow healing, loss of sex drive, brittle nails, hair loss, thin and dry hair, dry skin, and, in extreme cases, shortness of breath and palpitations. The disease is most commonly caused by a lack of dietary iron, folic acid, and vitamin B12. Iron is necessary for the production of hemoglobin, a protein that makes it possible for red blood cells to carry oxygen to our tissues. Folic acid and vitamin B12 are essential nutrients for the formation of the red blood cells themselves. Though such nutrients are readily present in whole, natural foods, anemia affects an estimated 3 to 6 million Americans.

One reason why such deficiencies exist even in people who can afford whole foods is simply a lack of knowledge. Most doctors don’t receive a thorough education in nutrition, let alone the average American, and most people don’t realize that eliminating the cause of their symptoms could simply be a matter of eating better. Another reason is that our society is structured so that processed foods are cheaper and more convenient than more nutritious whole foods. Nevertheless, it would be difficult to find even one anemia sufferer who would really rather endure fatigue, depression and back pain than make some changes in diet and lifestyle that would not just eliminate those symptoms, but make for a more satisfying mealtime as well.

If you are (or think you may be) anemic, nutrient deficiency is very likely the cause. To increase your intake of the nutrients you need, try these recommendations:

–Add more leafy green vegetables to your diet. These include kale, collard greens, cabbage, bok choy, Swiss chard and spinach. Leafy greens contain both iron and folic acid, as well as manganese, another important nutrient for iron absorption. They also contain chlorophyll, a nutrient similar to hemin, the pigment that forms hemoglobin when combined with protein.

–Add more iron-rich red meat, such as lamb and beef, to your diet. These meats also contain vitamin B12 and the protein needed to form hemoglobin. However, the meat should come from grass-fed animals: animals that did not eat their greens will have little iron in their own blood, and meat from anemic animals won’t do much to help you overcome your own anemia. Organs such as the liver and kidneys are especially rich in nutrients, and since blood cells are formed in the bone marrow, try making a soup with beef bones containing marrow.

–Seafood is another good source of iron, B12 and protein, but it should be wild-caught. Organic eggs and dairy products from grass-fed cows can also provide the same nutrients.

–Other foods that contain the nutrients you need: whole grains, beans, nuts, dried fruit, and especially sea vegetables such as nori and kombu.

Whether you’re anemic or not, eating more of these foods will without a doubt increase your energy and improve your mood, and since they contain such a wide variety of nutrients, they will address other types of deficiencies as well. So give it a try, and email me with any questions!

How To Have Beautiful Clear Skin…Indirectly

Each year, people spend millions on products designed to improve the appearance of their skin. This is understandable, as the condition of our skin strongly influences our physical attractiveness and self-confidence. However, those who focus only on how their skin looks are fighting a losing battle; it’s what is on the inside that matters, and in more than one way. To be more concerned with our physical appearance than with our conduct towards others is to invite more stress into our lives, and stress contributes to acne, eczema, and other skin disorders. And to apply products to our skin to clean it up is to ignore the nutritional deficiencies and other internal health issues that are contributing to those disorders in the first place. Just as focusing on losing weight, rather than on health, results either in failure to lose weight or in success at the expense of health (e.g. anorexia), focusing on skin care, rather than on overall health, results only in a temporary abatement of poor skin and a lifelong dependence on care products, rather than lifetime freedom from skin disorders.

The skin is one of our organs of elimination. When there is any excess of toxins in the body, some of them will be carried out of the body by means of sweat, acne, or skin rashes such as eczema. If you eat a diet high in processed foods and low in nutrients, and are not very physically active, your body will come to contain an excess of toxins, some of which it will attempt to remove through the skin, resulting in continual skin eruptions. Excessive hormone production (which occurs during adolescence, menstruation, and during periods of stress) also contributes to skin disorders, as the hormones produced result in clogged pores that slow the elimination of toxins. Clogged pores can harbor bacteria and become infected, further worsening the condition of the skin.

If you would like to have beautiful skin naturally, the approach is simple. Take whatever you might have been spending on skin care products, and devote it to your food budget instead. By adopting a balanced diet of whole, natural foods, you will provide your body with the nutrients it needs to detoxify quickly and easily, while reducing the number of toxins that are going into your system. Reducing stress and increasing physical activity will also speed the process.

At Live Free Nutrition we believe in subtraction by addition, so here are some tips for what you can add into your life to help improve the health (and consequently the appearance) of your skin:

–Eat more foods that are full of nutrients and aid in the process of detoxification: leafy green vegetables (especially cabbage, and the broth made from boiling cabbage), cucumbers, carrots, squash, pumpkin, celery, onions, garlic, sea vegetables, whole grains (especially brown rice and millet), sprouts, and any and all fruit.

–Eat more good quality fat, particularly organic butter, chicken skin from healthy chickens, raw milk and cream, avocados, olives and their oil, eggs with deep yellow yolks, and coconut oil. Skin is mostly made from fat, and fat is necessary for you to absorb the fat-soluble vitamins A, E, and K, which are essential for skin that is not just blemish-free, but also vibrant and glowing. Eating more good quality fat will help you avoid poor quality rancid fat from processed foods, which contains free radicals that contribute to wrinkles and the general breakdown of skin cells.

–After introducing healthier foods, you will experience a brief increase in skin disorders as your body takes advantage of the added nutrients to thoroughly detoxify. To get this stage over with quickly, apply tea tree oil (a natural antiseptic) to inflamed, infected areas of the skin, and powdered French green clay (mixed with water and daubed on the affected area) to acne in general, as it will draw toxins out more quickly. After the initial detoxification, if you maintain a healthy lifestyle, you will rarely need these products. Some other recommendations:

–Brushing your body in the shower with a stiff skin brush can help the skin’s eliminative action.

–Trading in all conventional skin care products, soaps, and shampoos for organic ones, or ones without any artificial or chemical ingredients, will cut down on toxins, and will probably also eliminate rashes and many other skin problems.

–Ocean bathing, if you can get it, is very soothing to the skin.

–If you can’t find tea tree oil, lemon juice is also a natural antiseptic, and less expensive.

Lose Weight Eating Chocolate. Ask Me How.

I saw these words on the bumper sticker of a car I passed as our family was driving home from vacation, and they immediately caught my interest. My first instinct was to catch a glimpse of the driver to see how healthy he or she looked (answer: not terribly). After all, isn’t it a rather dubious claim that one could eat chocolate in order to lose weight? But of course that’s where the “ask me how” part comes in. Whatever issues we struggle with—health, finances, relationships—we’re always on the lookout for an expert who can promise a solution that doesn’t require us to change anything about ourselves. While chocolate can be part of a healthy and balanced diet, people who eat it to excess due to sugar cravings are likely to put on some pounds. What if there were a way to get one’s “fix” without any consequences? It’s in our nature to seek out purported solutions of this kind, but we know deep down that they don’t really work. Resolving our problems involves making some tough choices. For this reason, another group of people argues that we just need to toughen up. “You want to lose weight?” they say. “Stop eating so much!” In fact, we tend to be tough like this on people who struggle with things we find easy, while seeking out miracle cures for our own particular weaknesses.

The reality is that while solving problems does require meaningful change, it also requires practical strategies and support, not just toughness. Eating right, budgeting our finances, and interacting successfully with people are all skills that require practice and knowledgeable guidance to acquire. The good news is that when you’re willing to commit to meaningful change, the battle is essentially already won. After you start eating better, you not only feel healthier and more energetic, but you enjoy your experience of eating more, and you actually find it difficult to go back to your old habits. It’s not a matter of ongoing willpower, but of initial willingness. One client of mine, for example, called me up to ask for a healthy alternative to caffeinated soda. He wanted the extra energy, but without the negative effects on his health. I had to explain that there is no healthy form of a “quick fix”—that the only healthy thing to do was to give his body what it really needed. In this case, that meant extra rest, such as a short nap during the day. To his credit, he was willing to give it a try, and started substituting real rest for the soda. After a week, he had more energy than before, without needing to sacrifice his health, and had lost the craving for caffeine.

Can you really lose weight eating chocolate? Of course, as long as it’s just one part of a balanced diet of whole foods. But my emphasis on eating more whole foods is not a “toughen up” type of recommendation. If you are really eating healthy, not only will you love it, but any junk food that you used to crave will no longer have the same hold over you. All that’s required is the willingness to take that first step towards real, positive change.