A Food Safety Double Standard

On August 3rd of this year, armed government agents, including representatives from the FDA, FBI, California Department of Food and Agriculture, LA County Health Department, and the LAPD, raided Rawesome Foods, a raw food co-op in Venice Beach, California, that was accused of selling raw milk without a license. The agents confiscated cash, computers and files, and carted away or destroyed $70,000 worth of farm-fresh produce. Surveillance cameras show that Rawesome volunteers were lined up against a wall and frisked at gunpoint. The agents arrested several people, including volunteer and organizer James Stewart, whose bail was initially set at $123,000 – far more than is typical for alleged drug dealers, child molesters and killers. And yet, in its 12-year history, Rawesome had never been linked to a single case of foodborne illness, despite the fact that its products included not just raw cow milk but raw goat, sheep, and camel milk, and even unwashed eggs. In fact, Rawesome had been raided a year prior so that its products could be randomly inspected – but no dangerous contamination was found. It appears that the authorities spent the interim trying to find another justification for shutting down the co-op, even having agents pose as members for a year to seek out evidence of wrongdoing.

While it’s true that Rawesome did not have a license, the co-op did not operate as a purveyor of milk, nor as a business in the strict sense. All members were volunteers, and all the money paid for the products went directly to the farmers to cover their costs. Essentially, the members were pooling their money to buy from farmers more efficiently, and each member was required to sign a waiver acknowledging the potential pathogenic content of raw foods. The same waiver also guaranteed the organic, grass-fed diet and free-range lifestyle of the cows, goats, chickens, etc. By signing the waiver, the members were taking responsibility for their personal, health-motivated food choices. Although raw milk can be sold legally in California, it is so tightly regulated that only one or two companies can afford to offer it, and those companies do not provide as much variety as is available by going directly to trustworthy farms. As can be seen from the case of Rawesome, the government is ready and waiting to devote its resources to prosecuting any apparent deviation from the already strict rules.

Meanwhile, also on August 3rd, Cargill, Inc., an agricultural company that is the largest privately held corporation in the US in terms of revenue, and which supplies about 22% of the domestic meat market, announced that it would be recalling 36 million pounds of ground turkey due to contamination with an antibiotic-resistant strain of Salmonella bacteria. The recall was in response to an announcement by the Centers for Disease Control and Prevention and the U.S. Department of Agriculture that the contaminated turkey had been traced to Cargill’s Arkansas processing plant. At the time of the recall, the turkey had been linked to 77 cases of illness, 22 hospitalizations, and one death. More recently, the number of illnesses has been reported at 129, with 33 hospitalizations, across 34 states. Since the CDC estimates that Salmonella cases occur on the order of 30 times the number reported, that means as many as 4,000 people were sickened by Cargill’s food product. Cargill’s meat, it should be noted, has a history of contamination going back decades – though you would never know it from store shelves: most of the contaminated turkey was sold under a brand called Honeysuckle White, with the name Cargill nowhere to be found.
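
To make that “as many as 4,000” figure explicit (my own back-of-the-envelope arithmetic, simply applying the CDC’s roughly 30-to-1 underreporting multiplier to the reported case count):

129 reported illnesses × 30 ≈ 3,870 people, or roughly 4,000.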

How did this happen? Antibiotic-resistant pathogens develop when antibiotics are routinely used on factory farmed animals that are constantly sick due to the toxic environment in which they are raised. It’s inevitable that resistant strains will develop and wind up in the meat we eat from these “farms.” Of course, given that the government is so zealous and meticulous in protecting our nation’s health that it will even locate and shut down the smallest raw milk co-ops (most of which serve a few hundred fully informed people at most) just because of the mere possibility of contamination, it would be even quicker to crack down on a business producing factory farmed meat that actually hospitalized hundreds of unsuspecting people and sickened thousands more, right? On the contrary. As an article in the Wall Street Journal reported,


Federal officials said they turned up a dangerous form of salmonella at a Cargill Inc. turkey plant last year, and then four times this year at stores selling the Cargill turkey, but didn’t move for a recall until an outbreak killed one person and sickened 77 others.


As for the recall, it was only a request on the USDA’s part, not an order, as the USDA lacks the legal authority to force a recall. Cargill’s recall was voluntary and, once the truth was out, aimed at salvaging its image; most of the recalled meat dated back as far as March and had already been consumed. But the most amazing part is that Cargill’s lack of quality control wasn’t even against the law. Federal regulations permit up to 49% of all samples tested at poultry plants to be contaminated with salmonella, and since new antibiotic-resistant strains pop up all the time, thanks to factory farming methods, the government doesn’t have bans on all of them. In fact, only one food-borne pathogen, E. coli O157:H7, is classified as an “adulterant” by the government, meaning it’s against the law for it to be present in food. In other words, even if inspectors find salmonella contamination, they can’t really do anything about it. No one at Cargill was charged with any crime, nor did Cargill even receive a fine. In their own carefully chosen words, they weren’t even responsible:


“It is regrettable that people may have become ill from eating one of our ground turkey products and, for anyone who did, we are truly sorry,” Steve Willardsen, president of Cargill’s turkey processing business, said in a written statement.


Ironically, it’s the very same small farms that are in trouble with the government that are producing meat and milk from healthy animals that don’t require regular antibiotics, if they require any at all. That means their meat isn’t contaminated with the “superbugs” present in factory farmed meat, and that their milk is safe enough to drink raw. Yet these farms, which are the antidote to the food safety dangers confronting us, are the ones under pressure. According to the FDA, it is shutting down raw milk clubs in order to protect health – particularly the health of children. From the New York Times:


Siobhan DeLancey, a spokeswoman for the federal Food and Drug Administration, which participated in the investigation of Rawesome, said the administration banned the interstate sale of raw milk products because they could be dangerous for those with compromised immune systems.  “Our biggest concern is really with children, because pathogens that can be in raw milk can be extremely dangerous for the classically at-risk,” she said. “We’ve seen people wind up as paraplegics.”


Ironically, the gradual increase in the number of children with compromised immune systems is likely due to the fact that children with still-developing digestive systems consume, on a daily basis, hard-to-digest pasteurized milk and white bread that ultimately inflames their immune systems and results in autoimmune disorders. Nevertheless, Ms. DeLancey seems uninterested in how these children came to be immunocompromised in the first place. In fact, the government’s actions have very little to do with promoting health and ensuring food safety, and a lot to do with satisfying lobbyists for large, wealthy food corporations. Those corporations have influenced legislators, and thereby the law, so that they are very difficult to regulate despite their grievous lapses in quality control, while small family farms that produce food according to traditional methods are very tightly regulated and are easy targets for obliteration if they make a single misstep – or even if they don’t. It’s hard not to conclude that the conventional food industry is using the government to suppress small businesses that produce high quality, fresh food, because the big corporations cannot imitate this model – and because, when it comes to food, more and more people care about healthfulness, flavor, and freshness, and less about paying the cheapest price regardless of quality.

There are reasons to be wary of small raw-food providers. Raw milk is only as healthy as the cow it came from. It has happened before that unscrupulous or just plain ignorant dairies have tried to cash in on the raw milk fad by selling some of their milk raw, without taking care to ensure that their cows are grass-fed, free-range, and in good health, and that their operations are sanitary. If you’re going to drink raw milk, buy it from a farmer you can trust, one who also tests his milk for pathogens. Pathogens are extremely unlikely to be in there (in fact, raw milk from healthy cows tests far lower in bacterial counts than the standard required for pasteurized milk), but there’s no harm in being extra safe. If you’ve taken this precaution, you’ve got nothing to worry about.

On the other hand, if you’re buying a product such as ground beef from a faceless company such as Cargill, whose quality you can’t verify until it’s too late, you’re on much shakier ground, left hoping such companies police themselves. After all, they benefit so much from agricultural subsidies that they can recall millions of pounds of meat and still keep chugging along. With the same tax dollars that pay the authorities to shut down small farms selling good quality food, we’re also paying the big food corporations to perpetuate their existence and crush their competition.

However, as more people become aware of what’s going on, choose to pay a little more for truly good food, and then benefit from a healthier, more satisfying diet, small farms will prosper despite the pressures being put upon them from above. When enough people want these farms to be legitimized, the laws will change, and the pressure will be on companies like Cargill to adapt or fail. To find sources of raw milk and other natural food-producing farms in your area, try http://www.realmilk.com/.


Summer Reading Guide

I know, summer is practically over. But if by any chance you’re looking for a captivating and educational health read (besides my newsletters, of course…*ahem*), try any of the books listed below – you’re sure to be both entertained and edified. The links will take you to each book’s listing on Amazon.com.

The Jungle Effect by Daphne Miller. Miller, a California MD, decided that the best way to help her chronically unhealthy patients would be to put them on the whole-foods based, traditional diets that their ancestors ate. However, in order to do so, she first had to research those diets. Due to the diverse ethnicities of her patients, she ended up traveling to countries as far-flung as Mexico, Greece and Iceland to learn about these traditional diets in regions where they were still being practiced. Alongside anecdotes about how her patients adopted these diets and got healthy, the book contains eating plans and recipes for the various traditions she studied.

Food Rules by Michael Pollan.  This one’s short – and memorable. It consists of 64 (usually) one-sentence rules about what kind of food we should eat. The rules are geared towards eating more whole foods, and fewer processed foods – examples include “Avoid foods that contain high-fructose corn syrup,” “Don’t eat anything your great-grandmother wouldn’t recognize as food,” “The whiter the bread, the sooner you’ll be dead,” “Eat your colors,”  “Don’t buy food where you buy gasoline,” etc. You might not agree with every single rule, but they’re certainly thought-provoking and creative.

Nourishing Wisdom by Marc David. This short book, while confirming that eating good quality whole foods is very important, addresses the other factors that go into determining our health, such as our emotional and spiritual nourishment, as well as the importance of “how” we eat: e.g., are we enjoying our food slowly while sitting at a table with friends and family, or gulping it down while driving to work? Some of us need this type of practical wisdom far more than we do more advice on what foods are good and what are bad.

The Self-Healing Cookbook by Kristina Turner. As the title indicates, this is a cookbook as much as it is a health book. Turner writes from a macrobiotic perspective, which means that the recipes center on adding whole grains, beans, vegetables and sea vegetables to your diet. However, Turner also details how different foods can affect your mood and emotions, and gears her recipes towards helping you to establish a balanced physical and emotional state. The exercises in the book also help you figure out which particular foods are best for you and why.

The Energy Balance Diet by Joshua Rosenthal. If any of you out there are determined to find a specific diet plan to follow, I can’t recommend any book more highly than this one, written by the founder and director of Integrative Nutrition, where I received my health counseling education. Rosenthal shows how to develop a balanced diet of whole foods that will help you to achieve your correct weight, establish steady energy levels, and understand and address your food cravings. Like all the other books on this list, it’s well-written, easy to follow, and entertaining without being shallow or extreme. Happy reading!

Analyzing the hCG Diet

Today’s most popular crash diet is the hCG diet, which consists of eating no more than 500 calories per day, while supplementing (via regular injections prescribed by a doctor or through lozenges, sprays and drops) with the pregnancy hormone hCG (human chorionic gonadotropin). HCG is naturally produced by pregnant women to maintain the corpus luteum, which it does by causing the body to secrete the uterus-enriching hormone progesterone. Its mainstream medical usage is as an infertility treatment for women. HCG also helps maintain testosterone production that is otherwise lowered by performance-enhancing steroids, which is why it is banned in professional sports.

What does hCG have to do with weight loss? Back in the 1950s, British endocrinologist Albert Simeons claimed that when he gave it to his obese patients in India, they lost weight in just the places where they needed to lose it – but only when they coupled it with an extreme low-calorie diet. The theory was that hCG stimulated the body, when faced with near-starvation, to burn unnecessary fat rather than muscle tissue. Proponents have also claimed that hCG supplementation suppresses hunger, making a 500-calorie diet relatively sustainable. Since Simeons published his theories in 1954, the hCG weight-loss fad, like many others, has alternately gone in and out of style, and is currently enjoying a resurgence.

Does the diet work? It depends who you ask. Doctors who provide hCG injections and diet consultations costing over $1,000 per month certainly claim that it does, as do websites that offer hCG by mail-order. Also online are many anecdotal testimonies of the hCG diet’s effectiveness, of which an unknown number have been written by hCG salespeople. A Dutch study back in 1995 analyzed 14 randomized clinical trials of the hCG diet and found that in only two trials did people accomplish more on hCG – in terms of weight loss, reduced appetite, and improved figure – than on a diet with a placebo used in place of hCG. (And that comparison says nothing about whether either diet worked well at all.) The FDA has said with regard to hCG:


“HCG has not been demonstrated to be effective adjunctive therapy in the treatment of obesity. There is no substantial evidence that it increases weight loss beyond that resulting from caloric restriction, that it causes a more attractive or “normal” distribution of fat, or that it decreases the hunger and discomfort associated with calorie-restricted diets.”


The American and Canadian Medical Associations have also condemned the diet. In addition to being no more effective than a placebo, hCG in excess is known to cause headaches, blood clots, leg cramps, and constipation, and may cause other health problems; its side effects in connection with a starvation diet have never been thoroughly studied.

At this point, to be confident that hCG works, you’d need to have acquaintances you know and trust who have tried it, lost weight permanently, and are still visibly healthy and active. But even if hCG has either no effect or a negative effect, what about just doing the low-calorie diet? It’s possible that hCG does function as a placebo, simply giving people the confidence they need to stick with the low-calorie plan. But super low-calorie diets, due to malnutrition, cause not just weight loss but also bone and muscle loss, mental deterioration, and exhaustion, so that even without hCG a 500-calorie diet is dangerous to your health. Although you will lose weight – it’s practically impossible not to when you don’t eat – you will simply gain it back when you’ve finished dieting and have gone back to the old diet that caused you to gain weight in the first place, except that this time your body will have deteriorated further due to the added strain of having dieted. Crash dieting, diet drugs, and even anti-obesity surgery to some extent, have never worked, though they’ve been tried many times. If any of these strategies worked, we wouldn’t still be searching for solutions to the obesity epidemic that affects 1 in 3 Americans.

The FDA, AMA and other major government and medical organizations are somewhat culpable here, because even as they scramble to announce that the hCG diet is ineffective and dangerous, they are content to continue to put their hope in sanctioned, mainstream “quick fixes” that consistently fail to pan out. In October of last year, the FDA declined to approve three separate weight loss drugs due to health risks. Two of the drugs were new (Qnexa and lorcaserin); one had been on the market for 13 years (Meridia). Particularly since the debacle of fen-phen, a weight loss drug approved in the early 90s that was years later shown to cause potentially fatal pulmonary hypertension and heart valve problems, the FDA has had to be stricter about the drugs it approves. Nevertheless, as quoted in the article linked to above, Dr. John Jenkins of the agency’s Office of New Drugs has said that the FDA is “committed to working toward approval” of new obesity drugs, “so long as they are safe and effective for the population for which they are intended.”

The attitude that a drug, or a device like an obesity lap band, can at some point be an effective way to combat weight gain, when validated by the FDA and our medical professionals, simply encourages the average person to think that they can get away with focusing on the symptom of the problem and relying on a quick fix (like the hCG diet). While this is profitable for both the pharmaceutical industry and the supplement industry, it doesn’t really help those who are overweight. We in fact need to deal with the root cause of the problem by making sound long-term diet and lifestyle changes. But as I discussed in my article on MyPlate, since the government’s approach to diet and lifestyle is severely flawed, people are even more disinclined to deal with those root causes.

If diet and lifestyle changes are made wisely, however – without crash dieting, excessive food restriction, or anything more than moderate exercise – those who need to lose weight will lose about 1.5 to 2 pounds a week, or about 40 pounds in six months. This weight loss will continue until a healthy weight is achieved. This is what has been achieved by clients of mine who have followed my recommendations to eat a balanced diet of whole foods. The best part is that they don’t have to change the way they eat once they’ve reached their weight loss goal, because they aren’t eating to lose weight in the first place, just to be healthy. The weight loss simply happens naturally. One thing we tend to forget easily is that the human body is meant to be a healthy weight. We think that we need to punish and manipulate our bodies to get them to the weight that’s healthy – but in fact it’s the opposite: we’re punishing and manipulating them when we load them up with high fructose corn syrup, toxins, artificial flavors, free radicals, and hydrogenated oils, and when we’re sedentary instead of active. When you have a willing spirit and the knowledge of how to go about it, getting healthy and in shape is actually one of the easiest and most fun things you can do – no supplementary hormones required.
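
As a quick check on that pace (simple arithmetic, taking six months as roughly 26 weeks):

26 weeks × 1.5 lb/week ≈ 39 lb, and 26 weeks × 2 lb/week = 52 lb – so “about 40 pounds” sits at the lower end of that range.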

How to Approach Antioxidants

Antioxidants are one of today’s most popular nutrient groups. Many health books and articles have been written extolling their virtues. Capitalizing on this popularity, food producers tend to prominently advertise the antioxidant content of the beverages, cereals, teas and other items they offer, often directly supplementing their products with antioxidants to increase nutrient content. As with vitamins, minerals, omega-3 fatty acids, and digestive enzymes, the presence of antioxidants is a persuasive signifier of a given supplement or food’s health benefits. However, to really benefit from antioxidants, we must understand why and in what context they are valuable. Simply consuming more foods that are advertised as containing them (e.g. green tea, chocolate, and red wine) is too simplistic an approach, and can have negative consequences for health.

In order to comprehend antioxidants, we first have to discuss their counterparts, free radicals. Free radicals are molecules that our bodies generate in order to neutralize toxins as well as pathogens such as bad bacteria, viruses and fungi. Having reacted with oxygen in a process known as oxidation, free radical molecules lack the electrons they need to be chemically stable. They stabilize themselves by stealing an electron from another molecule, which then itself becomes a free radical. Our bodies use free radicals to start a chain reaction of molecule destruction among whatever toxin or pathogen has invaded our system. Free radicals are also created by our bodies when we’re stressed out, physically injured, or when we exercise.
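
In shorthand (a simplified sketch of textbook radical chain chemistry – my illustration, not notation from any source cited here): if R• is a free radical (the dot marks its unpaired electron) and AH is a stable molecule, then

R• + AH → RH + A•

The radical R• is now satisfied, but the molecule it robbed has become a radical itself (A•), ready to attack the next molecule in line – which is what makes the process a chain reaction.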

It’s clear that free radicals play an important role in our immunity. However, if we have too many free radicals active in our systems, they will begin turning on the body’s own cells, damaging those cells’ DNA and turning their molecules into additional free radicals.  This process of cell breakdown, continuing unchecked, is linked to the development of cancer, stroke, diabetes, heart disease, liver damage, premature aging, emphysema, Parkinson’s disease, schizophrenia and Alzheimer’s disease. In fact, unchecked free radical proliferation may be the most ubiquitous health problem of our time.

Excessive free radical activity can be caused by too-frequent stress, infection, or exposure to toxins (including cigarette smoke, polluted air and industrial pollutants, pesticides and herbicides, certain prescription drugs, and radiation), or by too-frequent consumption of rancid vegetable oils, which contain high amounts of free radicals due to their oxidation during high-heat cooking, processing, and lengthy exposure to light. If we’re facing any of these problems, how are we supposed to neutralize the free radicals? Enter antioxidants.

Antioxidants are molecules that are capable of “donating” electrons to free radicals to stabilize them, while remaining stable themselves. Our bodies manufacture some antioxidants, just as they manufacture free radicals. However, a major part of our antioxidant need is supplied by dietary nutrients. Examples of antioxidants are vitamin A, vitamin B2, vitamin B9, vitamin C, vitamin E, selenium, zinc, and a class of nutrients called polyphenols (which includes a subclass known as bioflavonoids).
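
Continuing the shorthand from above (again my simplified illustration): a chain-breaking antioxidant EH – vitamin E works this way – ends the cascade by donating an electron without becoming reactive itself:

A• + EH → AH + E•

Here E• is stable enough that it does not go on to attack other molecules, so the chain stops.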

It’s easy to see why antioxidants are touted as being so important. Who would not want to add to their diet nutrients that reduce the likelihood of cancer, diabetes, and the other diseases listed above? But rather than fall into the misconception that all we need to do to protect ourselves is eat more foods that are advertised as containing antioxidants, we need to take a holistic approach to the situation. We need a healthy amount of both free radicals and antioxidants. It may be that production of free radicals needs to be reduced, rather than antioxidants increased, as excessive antioxidant intake can cause problems of its own. Also, we should get our antioxidants from whole, natural foods, rather than from either 1. antioxidant supplements, 2. processed foods to which antioxidants have been added, or 3. foods that naturally contain them but also contain potentially harmful ingredients. I recommend the following strategy for developing a healthy balance between free radical and antioxidant levels:


1.      Reduce stress. If you are under continual stress, antioxidants can help somewhat, but the best thing you can do for your health is to actually resolve the stress in whatever way is right and appropriate. There is no way that diet alone can compensate for the damage to health done by ongoing mental and emotional stress.

2.      Reduce exposure to the environmental toxins listed above. One way to do this is by increasing consumption of organic food vs. conventional, and exchanging the conventional cleaning and self care products in your home for those with natural ingredients. Quitting smoking and cutting back on prescription drugs where appropriate will also help.

3.      Avoid consumption of rancid vegetable oils (e.g. fried foods from fast food restaurants). When purchasing vegetable oil for home cooking, choose olive, sesame, sunflower or corn oil that is cold-pressed and contained in dark glass bottles. Use coconut oil or lard from naturally raised pigs for high-heat cooking.

4.      Eat more of the following whole, natural foods that contain antioxidants: all vegetables, but especially leafy green vegetables; all fruits, especially berries; vegetable oils processed in the healthy way described above; dairy products from grass-fed cows; organic eggs; beans; whole grains; and all herbs and spices, but especially turmeric, oregano and cinnamon.

5.      Moderate your intake of foods that contain antioxidants but which can actually reduce mineral absorption when consumed in excess: green tea, chocolate, red wine, spinach, Swiss chard, and soybeans (unless cooked with kombu, a sea vegetable).


Ultimately, the point is that we should not reflexively think “I need that, it has antioxidants,” but focus on eating a diet of whole natural foods, balancing that diet based on our cravings, and reducing stress and exposure to toxins and rancid oils. If we take that approach, our free radical/antioxidant balance can be trusted to take care of itself, and we’ll have greatly reduced our risk for the diseases mentioned earlier. This is a holistic approach, one that accounts for both the nutrients we know and those we have not yet discovered. It can be relied upon regardless of what dietary trend is popular at the moment, and will not lead us down the path of eating in an unbalanced way even as we’re trying to get healthier.

The USDA’s MyPlate Eating Guide

On June 2, 2011, the USDA, in conjunction with First Lady Michelle Obama, released MyPlate, which replaced MyPyramid as their guide for how Americans should eat. For decades, the government has been trying to consolidate nutrition advice from health professionals and pass it on to Americans in a clear, easy to follow format, especially as our obesity, heart disease and diabetes rates have increased over the same period. However, in giving advice, the USDA has also been careful not to be too strong in warning people away from the processed food produced by the American food industry, which is both a major part of the economy and a major reason why Americans are so sick. As a result of its conflicting obligations, the government’s advice is often contradictory and confusing, and the MyPlate guide is no exception (though it is marginally better than its predecessors).

The plate that now replaces the pyramid as the icon of how to balance our diets consists of four approximately equal sections, one each for fruits, vegetables, grains and “protein.” There is also a separate cup beside the plate, labeled dairy. The USDA has boiled down its directives to the following simple messages: 1. Enjoy your food, but eat less; 2. Avoid oversized portions; 3. Make half your plate fruits and vegetables; 4. Make at least half your grains whole grains; 5. Switch to fat-free or low-fat (1%) milk; 6. Compare sodium in foods like soup, bread and frozen meals – and choose the foods with lower numbers; and 7. Drink water instead of sugary drinks.

Will MyPlate actually help stem the obesity epidemic? According to Michelle Obama, quoted in the USDA Press Release, “As long as [our kids’ plates are] half full of fruits and vegetables, and paired with lean proteins, whole grains, and low-fat dairy, we’re golden.  That’s how easy it is.” Unfortunately, MyPlate will probably not have much of an effect, despite the fact that some very good advice can be found among the USDA’s recommendations. A major reason for this is that MyPlate, in addition to offering good advice, also offers some bad advice, and fails to offer any advice at all in some crucial areas. As a result, those who try to follow it conscientiously will find themselves feeling hungry and craving junk food after meals, and those who follow it less conscientiously will find plenty of wiggle room within the guidelines for including lots of processed food.

For example, MyPlate instructs us to make at least half of our grains whole grains. Because whole grains are naturally more dense and fibrous than refined grains, and have a more complex flavor, they need to be paired with a healthy fat, like cold pressed olive oil, or butter from grass-fed cows, to be appetizing. Combining whole grains and healthy fat also helps us feel full exactly at the point when we’ve eaten the right number of calories. But in MyPlate, fat is either frowned upon or relegated to the background. In following MyPlate, people will try to eat whole grains with little or no fat, and will find them unpalatable.  They will therefore gravitate towards the refined grains, which they are permitted to eat an astounding 50% of the time. What’s ironic is that refined grains, not fats, are what cause us to gain weight, because while both are high in calories, fat makes us feel full but refined carbs leave us constantly hungry. Thus does MyPlate’s bad advice (reduce fat) actually nullify our ability to follow its good advice (eat more whole grains). What about the other recommendations? Let’s take a look at them one by one.

1.      Enjoy your food, but eat less. The first part of this message is good – food is meant to be a source of pleasure as well as nutrition and sustenance. The second part, however, propagates the misconception that we need to cut down on the foods we enjoy in order to be healthy. In fact, the most enjoyable foods also happen to be the healthiest, and when we eat these foods, we feel full right at the point when we’ve had enough. Only when we’re following a flawed plan like MyPlate do we need to worry about “eating less.”

2.      Avoid oversized portions. Again, when we’re eating a healthy diet, we can let our cravings dictate how big of a portion we need. Sometimes it will be large, sometimes it will be small, but it will always reflect what our body needs at that moment. MyPlate’s vague, one-size-fits-all statement doesn’t offer any concrete guidance.

3.      Make half your plate fruits and vegetables. Most people don’t eat enough fruits and vegetables, so this advice points in the right direction. But fruit shouldn’t usually be eaten with other foods – it’s better digested when eaten alone, as a snack or dessert. Though a one-size-fits-all approach still has flaws, a better general recommendation would be making your plate 1/3 vegetables, 1/3 whole grains, and 1/3 meat, eggs or beans.

4.      Make at least half your grains whole grains. Again, this recommendation should really be “Make all your grains whole grains.” If whole grains are better (and they are) we should eat them all the time; there’s no need for refined grains. The USDA doesn’t want to acknowledge this explicitly because so many food products are made with refined grains.

5.      Switch to fat-free or low-fat (1%) milk. First of all, fat makes you full, not fat, so following this recommendation won’t help reduce obesity. Saturated fat, the type in milk, was at one time linked to heart disease, but it’s since been discovered that the real culprit is the absence of another type of fat, omega-3 fatty acids, which are lacking in the milk and meat of factory-farmed cows. This recommendation should therefore be “Switch to whole unpasteurized milk from grass-fed cows raised on small family farms.” Finally, milk is not an essential food; the USDA implies that it is in order to satisfy the dairy industry. However, the nutrients it contains can be found in other foods, such as beans, eggs, and green leafy vegetables.

6.      Compare sodium in foods like soup, bread and frozen meals – and choose the foods with lower numbers.  While this is good advice if you’re going to be buying pre-made soup, bread, and frozen meals, the best way to get a healthy amount of sodium, and what the USDA should recommend, is to make your own soup, bread, and meals, adding salt until it tastes right to you.

7.      Drink water instead of sugary drinks. Hooray! The USDA got one right – sort of. People crave sugary drinks often because their diets are already imbalanced – and following MyPlate’s recommendations won’t take away that imbalance. Most people will just not be capable of following this advice, especially if they are eating MyPlate’s way. So the soda manufacturers don’t have much to fear.


As a response to MyPlate, I’ve created the following simple five-step alternative eating plan, which I think would vastly improve the health of anyone who adopted it, and which contains recommendations that complement one another:

1.      Eat whole foods, or foods with whole-food ingredients. For example, tomatoes, or tomato sauce containing tomatoes, garlic, herbs, and olive oil, but no sugar. With each meal, try to get some of the foods in each of the following macronutrient categories:

a.      Complex Carbohydrates: Grains such as brown rice, whole wheat or whole wheat flour, quinoa, barley, oats, corn, and buckwheat; Starchy vegetables such as potatoes, sweet potatoes, and squashes; Fruits such as apples, pears, melons, bananas, plums, mangoes, oranges, and grapefruit; Natural sweeteners such as maple syrup, agave nectar, brown rice syrup, barley malt, and raw honey.

b.      Protein: Animal products such as beef, poultry, lamb, pork, milk, cheese, fish and eggs; Beans and bean products such as lentils, black beans, kidney beans, navy beans, chickpeas, tofu and tempeh; Nuts and seeds such as almonds, peanuts, walnuts, sunflower seeds, pumpkin seeds, and cashews.

c.      Fats: Vegetable oils such as olive oil, sesame oil, coconut oil and corn oil; Animal fats such as butter and lard.

d.      Vitamins & Minerals: Vegetables such as greens (kale, collards, chard, bok choy, spinach, etc.), roots (beets, carrots, radishes, turnips, parsnips, etc.), bulbs (onions, garlic, celery, scallions, etc.), nightshades (peppers, tomatoes, eggplant), gourds (cucumber, summer squash), and many more; Fruits such as berries, lemons and limes; Herbs and Spices such as basil, oregano, thyme, rosemary, sage, garlic, pepper, cumin, cardamom, cinnamon, cloves, ginger, etc.

e.      Microorganisms: Raw fermented foods such as yogurt, kombucha, raw sauerkraut, kimchee, miso, and kefir.

2.      On average, eat about 1/3 carbs, 1/3 protein, and 1/3 vegetables with meals. The right way to balance a diet differs from person to person based on body type, activity levels, climate and environment, season, gender, age, and so on. If you’re eating healthy foods, then by listening to your cravings, you can figure out what balance is right for you at any given time. Some other notes: Fruit, especially raw, doesn’t digest well with most other foods, so it should be eaten as a snack or a dessert. Fats, such as olive oil or butter, can be obtained separately or from eating the whole food in which they are originally found (olives and milk, in this case). Herbs, spices, and sweeteners should all be used in small amounts, to flavor foods. As seen above, some foods contain more than one type of nutrient and can meet more than one requirement at once.

3.      Use good quality ingredients. All plant foods – grains, fruits, vegetables, herbs, spices, vegetable oils, etc. – should be organic when possible. Animal products should come from animals raised on their natural feed, and if that feed is organic, even better. Fruits and vegetables should be fresh and in season; if locally grown, even better. Vegetable oils should be cold pressed and packed in a dark bottle, in addition to organic. Milk is best unpasteurized, if from healthy cows; if pasteurized, choose organic and grass-fed. Even if organic, foods such as beans or tomato sauce are better cooked from scratch than from a can.

4.      Eat home cooked food at least 80% of the time; when eating out, choose restaurants that also follow the recommendations above.  When eating home cooked food, you can fully control the quality of the ingredients, the balance of the meal, and adjust the flavors and proportions as suits your body’s individual needs.

5.      If the above steps are followed, eat in accordance with your cravings. We’re taught that food cravings are to be resisted. But when we’re eating a healthy and balanced diet, our bodies naturally start to crave those healthy foods, and crave them in just the right quantity and proportion. As a result, once you’ve introduced your body to the healthy way of eating described above, you don’t need to worry about calorie counts, portion sizes, or how much fat or carbs you are getting – you can listen to your cravings, and they will guide you towards foods that will help you achieve your natural weight.


Such an eating plan, if many people followed it, would result in major upheaval in the American – and world – economy. Not only is our food system global, but people in the rest of the world look to Americans as their example for how to eat. The sugar, dairy, meat, and processed food industries would never abide by it, for it would mean an almost total abandonment of their products in favor of a more local, farm-to-table based economy.

This brings us back to the conflicts inherent in MyPlate. Programs like MyPlate and MyPyramid have tremendous influence, not in getting people to be healthier, but in giving them a flawed conception of what is healthy and what is not, and actually reinforcing them in eating processed food by giving them an unappetizing alternative.  Instead of trying to give people advice that is skewed by vested interests, the government should eliminate the subsidies that make processed food artificially cheap, so that it’s easier for us to make our own choices to eat healthier. While MyPlate is better than previous guides, it’s inherently insincere, as it is committed to the status quo rather than allowing a new food economy that actually supports our health.

Raw Milk, the FDA, and the E. coli Outbreak

On April 19th, the federal Food and Drug Administration filed a complaint against Pennsylvania Amish dairy farmer Dan Allgyer, alleging that he had violated federal law by delivering raw milk across state lines. The milk was being purchased on a regular basis by a cooperative of buyers in Maryland, a state that has outlawed the sale of raw milk within its own borders. It is legal to sell raw milk in Pennsylvania, but a violation of interstate commerce laws to deliver it to buyers in other states. According to Dara A. Corrigan, the FDA’s associate commissioner for regulatory affairs, “Drinking raw milk is dangerous and [it] shouldn’t be consumed under any circumstances…[the] FDA has warned the defendant on multiple occasions that introducing raw milk into interstate commerce is in violation of Federal law.”

However, despite their claims about the danger of drinking raw milk, the FDA could not point to any cases of foodborne illness arising from the consumption of Allgyer’s milk. In fact, raw milk in Pennsylvania is already highly regulated.  More than 110 farms in Pennsylvania have raw milk permits that are only maintained via regular and rigorous testing for the kinds of bacteria that cause foodborne illness. As Adam Helfer of the Washington Times pointed out,


“The confusion seems to arise from the FDA not understanding and differentiating between conventional milk (which needs to be pasteurized for safety) and raw milk from healthy, pastured animals and clean conditions. It is to be noted that grass-fed raw milk has been consumed safely by cultures for thousands of years.”


Cows fed on grass, their natural food, and raised in their traditional environment (open pasture) with plenty of space to graze, are consistently healthy, unlike their factory-farmed, grain-fed counterparts. As a result, their milk is not only more nutritious, but contains significant quantities of beneficial bacteria and enzymes, which protect the milk from pathogenic bacteria. Instead of putrefying, raw milk from healthy cows simply sours over time, as the beneficial bacteria proliferate, and ultimately turns into buttermilk, yogurt and cheese. While raw milk from factory farmed cows would be very risky to drink, raw milk from healthy cows is practically impossible to contaminate, and does not need to be pasteurized.

It all comes down to the question of whether the cows are healthy and closely monitored – standards which are easily achieved on a small family farm like Dan Allgyer’s. If these conditions are met, raw milk is vastly superior to pasteurized (whether organic or factory farmed – though organic pasteurized is superior to factory farmed pasteurized), both in terms of nutrition and taste, which is why growing numbers of people are choosing to purchase it. The FDA, in ignoring this distinction, treats all raw milk as equally dangerous, regardless of the cow it came from. Consequently, the FDA considers it necessary to take away our freedom to purchase raw milk.

In an attempt to be generous to the FDA, one could say that they are being busybodies only out of a sincere desire to protect our health. They may be trying to control what we can eat and drink, but at least it is with our best interests at heart. However, not only does the FDA permit the sale of cigarettes and alcohol – both of which, if consumed too frequently, are actual health hazards, unlike raw milk from grass-fed cows – the FDA even overlooks the dangers it has admittedly identified in pasteurized milk.

In January of this year, the New York Times reported that the FDA, each year, finds illegal levels of antibiotics in older dairy cows that are destined for the slaughterhouse. The big dairy companies regularly dose their cows with antibiotics because the factory-farm conditions in which these cows live are so unhealthy – and the cows’ diets are so poor – that they are sick almost every day of their lives.  Since it stood to reason that the dangerously high levels of antibiotics the FDA found might be in the dairy cows even while they are producing milk for human consumption, the FDA was considering testing the milk from the large dairy farms that were the sources of the high-antibiotic cows destined for the slaughterhouse.

However, the FDA’s proposal met with strong resistance from the pasteurized dairy industry. Why? Ostensibly because the testing would take long enough that milk from the cows being tested would have to be put on the market in the meantime. And if the milk turned out to be contaminated with antibiotics, it would then have to be recalled, costing dairy producers millions and harming their reputations. To quote from the New York Times article,


“What has been served up, up to this point, by Food and Drug has been potentially very damaging to innocent dairy farmers,” said John J. Wilson, a senior vice president for Dairy Farmers of America, the nation’s largest dairy cooperative. He said that the nation’s milk was safe and that there was little reason to think that the slaughterhouse findings would be replicated in tests of the milk supply.


The danger to us, of course, is that by consuming too many antibiotics in milk, we could not only weaken our own immune systems but also further the evolution of drug resistant strains of bacteria. In fact, one impetus for the new testing is that the antibiotics for which the FDA currently tests are no longer the only ones in use by dairy farmers. Why are so many new antibiotics being used? Because the most common ones are losing their effectiveness as those drug resistant strains of bacteria develop. If there was ever a situation for the FDA to step in, this was it. Unfortunately, all the dairy industry had to do was send a “sharply worded letter” to the FDA to get it to withdraw its testing plan for indefinite review. That means anyone who is drinking conventional pasteurized milk may be drinking a product too dangerous to be on store shelves.

So is the FDA really looking out for us? Or are they just looking for easy targets? It seems that any segment of the food industry that is large and influential is safe from oversight, but a single Amish farmer working hard to provide the highest quality of milk to his small group of buyers is Public Enemy #1. Perhaps the fact that his business is a threat to the big dairy industry is the real reason he is under scrutiny.

What does the future hold? Marylanders, and residents of other states, who would like to choose raw milk may have fewer and fewer options. Congressman Ron Paul has introduced a bill, HR 1830, that would allow the shipment and distribution of unpasteurized milk and milk products for human consumption across state lines; however, the bill is unlikely to pass. Conventional pasteurized milk will continue to dominate the market for the foreseeable future, and because the industry achieves its artificially low prices through unnatural factory farming (helped out by government subsidies on the grains and soybeans it feeds the cows), it will continue to pump its animals full of antibiotics just to keep them alive. Those antibiotics will also continue to give rise to new strains of drug-resistant pathogens. There’s one in particular that you might be reading about in the news lately: E. coli O104:H4.

E. coli is a bacterium commonly found in the intestines of warm-blooded animals. Most strains of E. coli are harmless, even beneficial, contributing to the flora of the gut, but a few (such as serotype O157:H7) produce Shiga toxin, which causes hemorrhagic diarrhea and kidney failure. The current outbreak in Germany is being caused by Shiga toxin-producing strain O104:H4 – a new strain that resists more than a dozen commonly used antibiotics, making illnesses caused by it extremely difficult to treat. In the span of a month, it sickened over 3,000 people and killed 36. European public health officials, desperately seeking the immediate source of the bacteria, first incorrectly guessed it was cucumbers and other raw vegetables imported from Spain; now they are fairly confident it was sprouts from an organic farm in northern Germany. How E. coli O104:H4 got into the sprouts in the first place has not yet been determined. But outbreaks of Shiga toxin-producing E. coli (STECs for short), which have been in existence for less than thirty years, almost always have their ultimate origin in cattle.

A recent article in Bloomberg News quotes Australian veterinary public health researcher Rowland Cobbold as saying that “Cattle are the main reservoir for E. coli, the family of bowel-dwelling bacteria from which the new bug comes…The cucumber [or other raw vegetable] may be the lead back to the original ruminant that was the source…It’s almost entirely likely that it came from cattle at some point.” From the article:


Outbreaks of bloody diarrhea caused by E. coli have usually been linked to contaminated meat, Cobbold said. In more recent outbreaks where fruit and vegetables were implicated, E. coli- contaminated manure or irrigation water were found to be the original source, he said.

“If this goes the same way as previous investigations, they’ll find the ‘smoking gun’ — the ‘smoking tomato’ or the ‘smoking cucumber’,” Cobbold said. “They will then follow the production source back to the farm and they’ll work out the various contamination routes.” Most likely that will lead to the “smoking cow,” or at least a specific herd where the strain can be found, he said.


Intestines, or fecal matter from the hide of a cow in a slaughterhouse, can mix with meat going into ground beef. E. coli in manure can also spread into nearby fields and water sources and thereby get into vegetables.  As a consequence of the latest outbreak, many are now calling for irradiation of our entire food supply (essentially, pasteurization of cucumbers and lettuce), another solution which would enable food producers to skip quality assurance, and which is likely to give rise to new types of health crises, just as the current system of industrial agriculture has done.

The capacity for pathogenic bacteria to spread continent-wide from a single farm or animal is one of the flaws of our global food system. But the real issue is what gives rise to a bacterium like E. coli O104:H4 in the first place: over-usage of antibiotics, the very issue which the FDA, despite its stated mission to protect our health, is hesitating to address. While it’s frightening that similar outbreaks in the future are almost inevitable, it’s deeply ironic that the FDA is busy attacking the very type of farm that, by raising healthy cows in a natural, small, easily monitored environment, and selling its products directly to its local customers, is designed to prevent such global catastrophes.

We are not in any danger from farms like Dan Allgyer’s. But we have reason to fear that factory farming, with its unhealthy cows full of antibiotics, will lead to new strains of E. coli that have the potential to repeatedly contaminate our global food system. If the FDA were serious about limiting the spread of foodborne pathogens, it would go to the source of the problem – the poor diet and unhygienic living conditions of the animals who are the initial victims of industrial, factory farmed agriculture. At the same time, it would leave in peace those who are choosing to bypass the industrial food system for a local system that’s safer, healthier, and more accountable.


Vaccination, an Overview, Part 5: Two Approaches to Vaccination

The lack of knowledge regarding the primary cause of the epidemics of disease in the previous centuries – that is, malnutrition and exhaustion leading to weak health – can be attributed to the medical establishment’s general emphasis on the symptoms, rather than the causes, of illness. This emphasis is not a feature solely of modern medicine or even of mainstream medicine. Throughout history, both mainstream doctors and alternative health practitioners have sought to provide us with “quick fixes” and convenient solutions to the symptoms of problems initially caused by poor diet and lifestyle, rather than recommending a diet and lifestyle that would help prevent such problems in the first place.

The first flaw in this approach is that the medical interventions intended to eliminate our symptoms invariably have significant side effects of their own, which then require further interventions with additional side effects. A second flaw is that in situations where such interventions are not available, if we have not been taught how to take care of our health, we are entirely at the mercy of disease.

We can think of suffering a severe reaction to an infectious disease as a “symptom” of being in poor health in the first place. However, the medical establishment, by minimizing the role played by a healthy immune system, implies that we are all equally defenseless against pathogens. The consequences of this approach are the ever-increasing vaccine schedule and also our society’s “germophobia.” In fairness, in recent decades, acknowledgement of the importance of diet and lifestyle on healthy immune function, and of the abilities of the immune system itself, has increased. But when flu season strikes, the message we hear is primarily to get vaccinated and to minimize the spreading of germs. Just as is frequently the case with diet, we try to eliminate what’s bad but we don’t try to replace it with what’s good.

In taking the vaccinate-and-sanitize approach to fighting disease, rather than an approach that promotes health, we may simply have shifted the battleground. Part 3 already discussed the real and potential debilitating side effects of the vaccines we administer to our children and ourselves.  As for sanitization, while it can eliminate potentially harmful germs from the environment, it often does so by means of toxic chemicals. And lack of exposure to germs is likely to result in oversensitive immune systems that will react negatively to pollen and commonly eaten foods.

In accordance with the second flaw of the symptom-focused approach, children are left vulnerable in instances where vaccines do not succeed in providing immunity, where they are unavailable, or where they have not been invented. And when unhealthy children enter germ-laden environments, as is inevitable given that they spend more time in the doctor’s office or hospital, they are at serious risk. Consider that in the United States an estimated 80,000 individuals per year die from infections acquired while in hospitals, most frequently due to catheters, which is not surprising given that hospital workers only wash their hands about 70% of the time. While our general environment, and our hospitals as well, are far more hygienic now than they were 150 years ago, and we are generally better nourished as well, we still have a long way to go.

As a short-term strategy, vaccinate-and-sanitize saves lives. But like most approaches that only seek to address the symptoms, not the causes, of our health problems, it inevitably results in new, mysterious health problems. At best, focusing on vaccines largely maintains the status quo. The Bill & Melinda Gates Foundation is an excellent example. The world’s largest private foundation, it possesses an endowment of $33 billion, and devotes a significant portion of its resources to improving global health. Vaccination and medication programs in third-world countries are the primary beneficiaries of these resources. However, after decades of giving, vaccine-preventable and other diseases still persist in these countries, and the individuals receiving vaccines and medications often lack basic needs such as nutrition, clean water, and transportation. If they can even access the medications, they may not have enough food to digest them. The wealth of the foundation is ultimately directed to the already wealthy pharmaceutical companies, while the residents of third-world countries remain malnourished and impoverished.

In the long run, symptom-focused strategies tend to benefit more those who promote them than those who are subject to them.

A better way to handle the threat of infectious disease would be to create the conditions for healthy, strong immune systems in children.  As discussed in Part 4, these conditions include eating a diet based on whole foods (and breast milk in the case of infants); drinking clean water as the primary beverage; getting enough rest and enough exercise; and reducing stress. Like a muscle, the immune system must also be exercised in order to be strong.  Natural “vaccination,” or technically, immunization, can occur when we are exposed from a young age to a wide variety of microbes in raw and fermented foods, in breast milk, even in dirt. It is a process not too different from that by which Benjamin Jesty’s dairy workers were naturally protected from smallpox.  In fact, a healthy child’s natural exposure to more mild infectious diseases, such as chickenpox, may be beneficial for healthy immune development. A well-nourished child with a well-exercised immune system is strongly equipped to handle the pathogens he or she is likely to encounter, and is likely to be one of the vast majority of children who do not suffer severe reactions to more serious diseases such as measles, or even polio or diphtheria.  A child who, in contrast, is raised on a diet of largely processed food, with little exposure to beneficial bacteria, has a sedentary lifestyle, and suffers from frequent colds and ear infections which are treated with antibiotics instead of fought by the immune system, is more likely to have a severe reaction to a strong pathogen and, in the absence of a change in diet and lifestyle, would probably benefit from the protection of most vaccines, despite their potential side effects.

Unfortunately, the well-nourished child is far rarer in our society than his or her conventionally nourished counterpart. The medical establishment has, for the most part, chosen neither to study nor to promote practices that make us healthier and consequently less dependent upon vaccines, medications, supplements, sanitizers, and surgery. Most doctors acknowledge that whole foods are better than processed foods, breast milk superior to infant formula, and some exercise better than none. Some may even recommend playing in the dirt over sitting inside all day. But junk foods continue to infiltrate our schools just as recess programs disappear from them. Infant formula is pushed on women whose babies are not growing at rates arbitrarily determined to be acceptable. Many people consider it inconvenient to try to be healthy, or to allow the immune system to fight disease, which is the primary reason why a chickenpox vaccine, for example, was invented. As long as our attitude toward health is symptom-focused, valuing short-term convenience over long-term wellness, our health will remain vulnerable and our need for medical interventions will grow.

In the late 19th century, during the worst epidemics of disease, the author and social theorist Leo Tolstoy wrote, in an essay entitled "Modern Science": "The defenders of present-day science seem to think that the curing of one child from diphtheria, among those Russian children of whom 50% (and even 80% in the Foundling Hospitals) die as a regular thing apart from diphtheria must convince anyone of the beneficence of science in general…our life is so arranged that from bad food, excessive and harmful work, bad dwellings and clothes, or from want, not children only, but a majority of people, die before they have lived half the years that should be theirs…And, in proof of the fruitfulness of science, we are told that it cures one in a thousand of the sick, who are sick only because science has neglected its proper business." According to Tolstoy, science, though an incredibly valuable tool, does not benefit humanity when misdirected. To be beneficial to us, medical science must be guided by wisdom and foresight rather than shortsightedness, and must possess a healthy respect for, and inquisitiveness into, the capacities of the healthy human body.

The ultimate question of whether and how to vaccinate your child is a difficult one, and the answer is not the same for everyone. There is no utterly risk-free approach; even the healthiest person can still succumb to a powerful pathogen, as can the most thoroughly vaccinated person. Your decision must involve an awareness of your child's likely susceptibility to each disease against which we vaccinate, and a weighing of each vaccine's benefits against the risks of its possible side effects. Whatever you decide, however, the best thing you can do for your children is to make the dietary and lifestyle changes that promote health. For further reading on the risks and benefits of vaccines, as well as strategies for strengthening the immune system, I suggest you consult one or more of the books listed below.

Vaccinations: A Thoughtful Parent's Guide by Aviva Jill Romm (2001). Discusses vaccines from a historical perspective and contains natural and herbal remedies for common childhood diseases, as well as recommendations for building immunity naturally. Romm is a certified professional midwife, a practicing herbalist, and a physician.

What Your Doctor May Not Tell You About Children's Vaccinations by Stephanie Cave, M.D. (2001). Explores the possible relationships between vaccines and autoimmune diseases and developmental disorders, and contains an overview of vaccines and the legal issues related to them, as well as an alternative vaccine schedule.

The Vaccine Book by Robert Sears, M.D. (2007). Contains a detailed guide to the current vaccine schedule, including a discussion of the severity and rarity of each disease and the ingredients and side effects of each vaccine. Also contains an alternative vaccine schedule.

The Vaccine Guide by Randall Neustaedter, O.M.D. (1996, 2002). Provides an extensive, technical overview of research on the safety of vaccines and its findings.

How to Raise a Healthy Child in Spite of Your Doctor by Robert Mendelsohn, M.D. (1984). Covers the most common childhood ailments and the appropriate treatments for them. Also contains a section on diseases commonly vaccinated against, their severity, and the effectiveness of the vaccines.

Vaccination, an Overview, Part 4: Building Immunity

As discussed in Part 1, vaccines are designed to stimulate the immune system. In fact, their effectiveness comes from triggering the body's own natural processes of adaptive immunity. The underlying assumption of vaccination is that the immune system is unlikely to be strong enough to handle a disease when it encounters it in nature, and that we therefore need vaccines to engineer that encounter safely and artificially. This assumption, which has its origin in the aforementioned germ theory of disease, is perhaps an understandable one. In the late 19th century, scientists had observed epidemic after epidemic of infectious disease resulting in millions of casualties. It was reasonable for them to conclude that the pathogens they had discovered were indiscriminately deadly. However, one scientist of the era, the French biologist Antoine Bechamp, proposed a different explanation for why people were succumbing to infectious disease at such great rates: their weak health.

Bechamp, a contemporary of and influence upon Pasteur, would have agreed with Pasteur's arguments that methods of sanitization (such as hand-washing and pasteurization) would prevent the spread of disease by eliminating pathogens from the local environment. However, Bechamp's theory was that most people who suffered from infectious disease did so because their own bodies were, in a sense, "unsanitized" on a cellular level. According to Bechamp, when we are in a diminished state of health, our cells and tissues form a breeding ground for microorganisms (or microzymas, as he called them) that are largely already present in our bodies, but which do not take on a harmful form or reach harmful levels without the supportive environment provided by a sick individual. Bechamp's theory stood in contrast to an interpretation of the germ theory that identified external pathogens as the sole and direct cause of infectious disease, regardless of the prior health of the diseased person.

Meanwhile, Robert Koch, a contemporary of Bechamp and Pasteur, formulated four postulates meant to establish a causal relationship between a unique pathogen and a unique disease: (1) the pathogen must be found in all organisms suffering from the disease, but not in healthy individuals; (2) it must be possible to isolate the pathogen and grow it in a pure culture; (3) it must cause the disease when introduced into a healthy organism; and (4) it must then be possible to re-isolate the pathogen from the inoculated organism and find it identical to the original. Koch later had to qualify the first postulate after finding healthy, asymptomatic carriers of the bacteria that cause cholera and typhoid fever, and the third after finding that not all organisms exposed to a pathogen display symptoms of infection.

Koch's findings indicated that both Bechamp's and Pasteur's theories had some merit. Pasteur's disease-centered approach, which relied on sterilization, pasteurization, quarantine, and sanitation, focused on preventing the spread of disease by eliminating the pathogen from the external environment. Bechamp's health-centered approach was based on making the individual stronger and healthier, and thereby better able to prevent pathogens from gaining a foothold within the environment of the human body. Although the specific mechanism of Bechamp's theory, that of microzymas arising from our own tissues to form pathogens, has never been proven, scientists have since discovered that our health plays a tremendous role in the effective functioning of our immune systems, and consequently affects how easily we succumb to infections.

The human immune system is a conglomerate of many different body parts and processes. The skin, liver, kidneys, respiratory tract, intestinal flora and more all play a role in destroying pathogens by means of inflammation, white blood cells, and antibacterial or antiviral chemicals and enzymes.  Those pathogens are discharged via the cleansing and flushing action of tears, urine, mucus and diarrhea. The cells that form the adaptive part of the immune system are able to retain memories of specific pathogens and thereby easily neutralize those pathogens with antibodies upon future encounters.

No system, mechanical or biological, can work properly unless supplied with the proper fuel or raw materials. A car cannot run without fuel, nor an ecosystem without water, oxygen, and sun. Our immune system is no different; in order to function, it must be provided with the nutrients it needs. Vitamin D, the antioxidant vitamins A, C, and E, vitamin B6, folic acid, and the minerals zinc, copper, iron, and selenium have all been found to be vital to the health of the immune system, as have the beneficial bacteria contained in raw fermented foods. Other nutrients contained in whole foods, as yet unstudied or even undiscovered, may be similarly essential. Where infants are concerned, breast milk provides, in addition to needed nutrients, a variety of immunologic factors such as immunoglobulins (antibodies), the enzymes lysozyme and lactoferrin, and lymphocytes (white blood cells). These ingredients promote not only the health but also the growth and strengthening of the infant immune system, and they protect against the routine infections that are far more common today in babies fed formula, which contains no immunological factors. Along with nutrition, people need a certain amount of rest and sleep, as well as moderate exercise and clean water, in order to maintain healthy immune function. Stress, extreme conditions, exhaustion and dehydration all weaken the immune system, and they also weaken a nursing mother's ability to provide nourishing milk.

The scientists who were formulating the germ theory of disease in the late 19th century were living during the tail end of the Industrial Revolution, a period of enormous social, political and technological change. Economies in Europe and America had shifted from an emphasis on rural agriculture to one on urban industry. In England in 1750, only 15% of people were living in urban areas; by 1860, that number had risen to 80%. Within the cities, the lower classes (both children and adults) had started working long hours in factories for little pay, often doing heavy labor in extreme conditions. As a consequence they were frequently exhausted and, as a result of their poverty, malnourished. The upper classes, on the other hand, deliberately chose to eat newly available refined foods that were low in nutrients and high in calories, and many women did not get enough exercise or sunlight. They too were weak and sickly, and prone to death in childbirth. In sum, the majority of people living during this era were in poor health, with poorly functioning immune systems and correspondingly reduced resistance to disease.

At the same time, the cities to which so many had relocated lacked waste disposal systems adequate for such large populations. Consequently, pathogens were able to contaminate the air, water, food and streets. Technology had developed in such a way as to ease the transmission of infectious disease without yet providing a means to prevent it. Doctors themselves were some of the worst transmitters. As yet unaware of the need to wash their hands (and in many cases outright rejecting the idea), they easily spread fatal pathogens to the many patients, particularly mothers in childbirth, whom they treated in busy urban hospitals. It is no surprise that infectious diseases ran rampant and that infant mortality averaged around 40%, with the highest rates occurring in the cities.

With the formulation of the germ theory of disease, sanitary practices such as those proposed by Pasteur and the physician Ignaz Semmelweis were grudgingly accepted by physicians, with positive results. However, the theory ultimately focused much more on the danger of microbes than on the ability of the healthy human body to resist them. As a result, scientists and government officials complemented sanitary practices by arguing the need for vaccines, rather than following Bechamp's lead in promoting the healthy diet and lifestyle that would strengthen the immune system itself.

Fortunately, due to the explosion of nutrient-deficiency diseases during the same period, vitamins were gradually discovered and added back into the processed foods from which they had recently been removed. The resulting improvement in nutrition, together with advances in sanitation technology, greatly improved overall health and hygiene in Europe and America following the turn of the century, though destabilizing events such as World War I and the Great Depression still occasionally created the conditions for epidemics. Smallpox, cholera, tuberculosis, diphtheria, scarlet fever, typhoid fever and other infectious diseases all began to decrease, whether vaccines had been developed for them or not. Endemic diseases like measles, mumps, rubella and chickenpox persisted, but were far less likely than before to cause complications or fatalities.

The only disease still causing epidemics in the developed world into the mid-20th century was polio. Polio, like many of the epidemic diseases of the time, had been around for thousands of years without ever being responsible for major epidemics prior to the late 19th century. Since 90% of polio infections cause no symptoms at all, deaths and paralysis from polio were rare. All that changed with the onset of the Industrial Revolution and the weakened health of the population; suddenly, polio could spread easily, and it met with little resistance in its victims. As sanitation began to improve, fewer people were exposed to the polio virus as young children, when they are least likely to suffer harm from it and most able to acquire long-term immunity. But while fewer people were exposed to polio, more died or suffered paralysis, since those who did not develop immunity as children encountered the virus as teenagers or adults, when the disease is more severe.

Additional factors in the severity of the polio epidemic were the rising fads of formula feeding and the tonsillectomy. By 1950 over half of all babies were being fed infant formula (lacking polio antibodies, naturally), which was being promoted as better than breast milk by now-discredited scientific studies. Tonsillectomies, meanwhile, had become a fad among surgeons and doctors: in the 1930s and 1940s between 1.5 and 2 million were performed each year. The tonsils are glands that aid the immune system by blocking pathogens; when they become inflamed, it is a sign that they are hard at work. These tissues formed the first line of defense against ingested or inhaled pathogens, such as polio. Since polio was not as stymied by better sanitation as other diseases, it was able to take advantage of the weakened immune systems of older children and adults. Still, polio, like most other infectious diseases, was already declining in incidence before the introduction of its vaccine.

As the historical evidence indicates, vaccines merely accelerated an already-occurring disappearance of infectious diseases in developed countries. Without advances in nutrition to ensure basic immune system function, and in sanitation to prevent the spread of pathogens, infectious disease would probably have persisted despite vaccination. Tuberculosis is a good example. During parts of the 19th century it was responsible for one quarter of all deaths in Europe. It no longer troubles the developed world, despite the fact that America and Europe never effectively vaccinated against it. However, it still causes between 1.5 and 2 million deaths per year in impoverished countries whose citizens suffer from poor health and sanitation, despite widespread vaccination in those countries.

In conclusion, while vaccines do possess varying degrees of effectiveness and can help to reduce the incidence of disease, they are not our most important form of protection against disease. As Bechamp theorized, the explosion of infectious disease in the 19th century was really due to a relatively brief but steep decline in general health, which, when paired with unsanitary living conditions, made epidemics inevitable. Our strategy for acquiring better immunity to all diseases, or providing the conditions for such immunity to our children, should primarily be to maintain good nutrition and health through breastfeeding, consumption of natural whole foods, clean water, regular rest, regular exercise, and reduction of stress.

In next week's article, entitled "Two Approaches to Vaccination," I'll discuss the underlying worldview behind the modern-day vaccine schedule and contrast it with a more holistic approach to public health.

Vaccination, an Overview, Part 3: The New Epidemic

In America today, the infectious diseases that remain, such as the flu, are far less life-threatening, and infant mortality has decreased drastically from just a century ago. Children today are highly likely to make it to adulthood. Coinciding with the reduction of infectious disease, however, has been the emergence of an entirely new kind of health problem in children: chronic disease. Children in ever greater numbers are suffering from immune system disorders and developmental delays that have no known cause or cure. Eczema, hives, hay fever and food sensitivities have been increasing since the 1920s, with rapid surges in the 1960s and 1980s, and these allergies now affect tens of millions. Asthma has been increasing since the 1960s, particularly in developed countries, and now affects 6 million children in the U.S. Attention-Deficit Hyperactivity Disorder has tripled in incidence since the 1970s. Autism spectrum disorders have grown from 1 in 2,000 in the 1960s and 1970s to 1 in 150 today, with the greatest spike occurring from 1996 to 2007. All of these increases are too great to be explained solely by genetic mutations (although genetic susceptibility does seem to be a factor) or by evolving diagnostic methods and definitions. Consequently, an external, environmental agent (or agents) must be triggering them. Since these diseases are chronic, are not infectious, and seem unassociated with any pathogen, they cannot be explained by the germ theory of disease, and scientists possess no alternative theory of what in our environment could be triggering them.

It is worth noting that our environment has changed drastically over the last half-century. Our food, water and air are less likely to be contaminated by bacteria like those that cause tuberculosis or cholera, but more likely to contain pesticides and other potentially toxic chemicals. Children who used to run and play outdoors, using up their excess energy and exposing their immune systems to many different natural substances, from pollen to poison ivy, now spend most of their time indoors in school or sitting still in front of a screen at home. At the same time they have adopted diets high in excess calories and low in nutrients. Antibiotics and pasteurization have reduced the presence of both bad and good bacteria in their lives. This new lifestyle could be the culprit behind children's hypersensitive immune systems and hyperactive behavior, or at least a contributor. When it comes to autism spectrum disorders, however, many parents believe that vaccines play a major role.

Vaccines have never been completely without side effects; even the safest will cause temporary reactions (such as pain and swelling, fever, vomiting, diarrhea, rashes, headaches and crying) between 5% and 40% of the time. Serious side effects are usually some form of inflammation, such as Guillain-Barre syndrome (an autoimmune disorder causing paralysis) or encephalitis (inflammation of the brain), though these are said to be extremely rare. One vaccine for which serious side effects were found to be relatively more common was the first combination vaccine, DTP (diphtheria, tetanus and pertussis), released on the market in 1946. In the 1970s and 1980s there was a growing awareness that the pertussis portion of the vaccine, which used whole cells of the B. pertussis bacterium, was responsible for a higher-than-expected rate of reactions such as convulsions, shock, cardiac distress and brain damage. In 1981 Japanese scientists developed a new vaccine that used a safer, acellular pertussis component and caused far fewer reactions, but this form of the vaccine was not adopted in the United States until 1996, after many years of lobbying by parents who had observed their children react adversely to the DTP vaccine.

As was the case with the DTP vaccine, suspicions of a link between autism and vaccines have their initial basis in the case reports of parents who see their children lose previously acquired mental and social skills following doses of vaccines, the majority of which are administered in the first two years of life, the same timespan in which autism usually appears. This correlation could be explained as a coincidence, but the issue is complicated by the fact that rates of autism have increased in conjunction with the rising number of shots given to children. In 1983, for example, children received vaccines for diphtheria, tetanus and pertussis (given together as DTP), polio, and measles, mumps and rubella (given together as MMR). This schedule represented vaccines for 7 diseases in the first 4 years: 6 shots containing 18 doses of vaccines, plus an additional 4 doses of the oral polio vaccine, for a total of 22 doses. In 1995 the schedule was largely the same, except for the addition of the vaccine against Haemophilus influenzae type B (HIB), a bacterium that causes meningitis. After that, however, the number of vaccines began to increase. By 2007, children following the standard schedule were receiving 40 total doses of vaccines against 14 diseases, double what had been given a decade previously. The number of shots did not increase as sharply, because new combination vaccines became available that combine four or five vaccines into one shot. The result has been a significant increase in the amount of foreign material injected into a child's body at one time.

As discussed in last week's newsletter, the ingredients of a vaccine must be carefully balanced and formulated in order for the vaccine to be both safe and effective. The typical vaccine components mentioned in the first section, the pathogen, the tissues in which it is cultured, an adjuvant to help stimulate immunity, and a preservative to protect the vaccine from additional pathogens, are each capable of causing unwanted side effects. Live attenuated viruses, found in the MMR vaccine among others, are better able to stimulate immunity, but are more likely than killed pathogens to cause a persistent infection and excessive inflammation, including inflammation of the brain (encephalitis) and subsequent brain damage. Animal or human tissues in which pathogens are cultured contain proteins similar to those in our own tissues. In reacting to the pathogen in a vaccine, some immune systems may see these proteins as part of the threat and produce autoantibodies against them. These autoantibodies cannot tell the difference between the injected proteins and the body's own, and the result can be a chronic inflammatory autoimmune disease such as Guillain-Barre syndrome, arthritis or multiple sclerosis. The most typical adjuvant, aluminum, is a metal that has been linked to Alzheimer's disease, dementia and brain damage, and it may be difficult for some children to detoxify. As for preservatives, some vaccines contain formaldehyde, a carcinogen, and most vaccines contained thiomersal, a form of mercury, until vaccine manufacturers agreed several years ago to provide mercury-free vaccines upon request. Could these ingredients, as they are injected into children with greater frequency and in greater quantities, be responsible for the increasing incidence of chronic immune hypersensitivity and developmental disorders in children? Clearly, not all children have negative long-term reactions to vaccines; in fact, it seems that most don't. But might some children have a genetic susceptibility to adverse reactions, particularly when vaccines are administered according to the current schedule?

What are the facts of the situation? First, vaccines carry the potential for adverse effects, including brain damage. Second, there is a parallel between increasing autism rates and the increasing number of vaccines given. Last, autism typically emerges during the period of life in which vaccines are administered. What have we proved? Nothing. These facts are not proof of a causal relationship between vaccines and autism; they show only a correlation. However, this correlation makes a causal relationship a possibility worth investigating, especially since no other cause of autism has been identified. Accordingly, many scientific studies have been done on whether a link between vaccines and autism exists. The initial safety studies done on each new vaccine by Merck, Sanofi Pasteur, Wyeth, and GlaxoSmithKline (the four large pharmaceutical companies that manufacture almost all vaccines), the results of which are reviewed by the FDA and the CDC's Vaccine Adverse Events Reporting System (VAERS), have not found a link for any individual vaccine. Doctors and research scientists, most notably at the independent, non-profit Institute of Medicine, have conducted many additional studies over the past two decades, as well as comprehensive reviews of earlier research, and the vast majority have also concluded that no link can be proven, confirming the scientific consensus that the serious side effects of vaccines are extremely rare and do not include autism.

The most famous study that did hint at a possible connection between vaccines and autism was published in 1998 in The Lancet, a British medical journal that is perhaps the most respected in the world. The lead author, Dr. Andrew Wakefield, and twelve of his colleagues argued, based on observations of twelve children with both inflammatory bowel disease and autism, that the children might have a new syndrome caused by the vaccine-strain measles virus, which was found in their intestines. Because the children had previously been normal, Dr. Wakefield suggested an environmental trigger might be the cause of the syndrome, and called for the MMR vaccine (the measles-mumps-rubella combination) to be discontinued in favor of separate vaccines administered at separate times until more research could be done. However, the British government felt that doing so would increase children's exposure to the three diseases. The results of the study were widely reported in the news media, and with MMR remaining the only vaccine available, many parents did not vaccinate their children against the diseases at all.

In the years that followed, both Wakefield and the study received increasing criticism. Other scientists did similar studies and reported that they had failed to duplicate the results. A journalist investigating Wakefield found that he had ties to a lawyer preparing a lawsuit against the MMR manufacturers, and that he had a patent on a new measles vaccine, both indicative of serious conflicts of interest. Ten of the twelve co-authors eventually disowned the paper. Earlier this year, The Lancet itself finally retracted the paper, and Dr. Wakefield lost his license to practice medicine in the UK.

In light of this evidence, it would seem that the possibility of any link between vaccines and autism has been thoroughly eliminated. But for a variety of reasons, we must question the credibility of those who signed off on vaccine safety, who authored and reviewed pro-vaccine studies, and who have promoted vaccines in the media. To begin with, the general public has long had good reason to distrust the ethics and integrity of the pharmaceutical industry, which has been known to disguise or minimize knowledge of adverse reactions to its products (such as Avandia, Vioxx and Fen-Phen). It has also been known to aggressively market its products to as wide a customer base as possible, going so far in recent months as to urge, with governmental approval, cholesterol-lowering drugs on people who do not even have high cholesterol. Vaccines are a guaranteed lucrative investment, given that they are prescribed to almost every individual in the country.

An additional strike against the pharmaceutical companies' assurances of safety is that they bear no liability for adverse effects of the vaccines they manufacture. In the 1980s, as more parents whose children had been injured by the DTP vaccine began to bring lawsuits against vaccine manufacturers, those manufacturers threatened to stop making vaccines entirely, reasoning that it would be unprofitable to continue if they had to pay expensive personal injury claims. In order to ensure that vaccines remained available to the public, the U.S. government stepped in and passed the National Childhood Vaccine Injury Act, which set up a special government court for hearing vaccine injury claims and awarding damages of up to $250,000. The damages are funded by proceeds from a tax on vaccines, thus shielding vaccine manufacturers from any financial liability. Claims are argued before a government-appointed judge rather than a jury, and while most claims are rejected, the court has had to award almost $2 billion in damages since its inception.

Clearly, pharmaceutical companies manufacture vaccines for profit, not out of an overriding concern for the safety of children. It is not likely that they would abandon products as profitable as vaccines even if they knew those products caused relatively frequent and severe side effects, just as they knew, but kept secret, the fact that Avandia increased the risk of heart attacks. It is therefore prudent not to accept at face value claims (and by claims, I mean advertising) by the vaccine manufacturers, and by the scientists whom they employ, that vaccines are extremely safe.

What about the government's independent oversight and regulatory authority? Unfortunately, as in so many industries (including banking, energy, and health care), a revolving door of employment exists between the pharmaceutical companies and the federal authorities that regulate them. An example is Dr. Julie Gerberding, who directed the CDC from 2002 to 2009, a period during which both the number of vaccines administered and the number of autism cases greatly increased. Dr. Gerberding waited exactly one year and one day after leaving the CDC – the legal minimum – before taking the job of president of the vaccine division of Merck. During her CDC tenure, Gerberding had heavily promoted Merck's new-to-the-market HPV vaccine, Gardasil, as well as the safety and effectiveness of vaccines in general.

As for the scientists and medical doctors who conduct research on the safety of vaccines, many rely on the financial support of the pharmaceutical companies; without it, they would be unable to conduct wide-ranging, long-lasting epidemiological studies of vaccine reactions. The most vocal and media-friendly proponent of vaccine safety, Dr. Paul Offit of the Children's Hospital of Philadelphia, happens to be the co-inventor of the rotavirus vaccine RotaTeq (also manufactured by Merck). Offit shared in the proceeds when the royalty rights to RotaTeq were sold for a reported $182 million.

The conflicts of interest described so far have their origin in greed, but some can arise from humanitarian motivations. Most public health officials worry that if doubts about vaccine safety were given a more thorough hearing, many parents might choose to vaccinate their children less, or not at all (as happened in the aftermath of the Wakefield study's publication), and consequently return us to an era of epidemic disease rivaling that of the 19th and early 20th centuries. To serve the greater good, the authorities may be unwilling to give a fair hearing to the possibility of a vaccine-autism link. It is possible that, even if Dr. Wakefield was partly right in his conclusions, the government and scientific community were driven by these kinds of fears to dissect his work for errors and conflicts and to magnify the flaws they found.

With so many powerful institutions – pharmaceutical companies, government, and scientific bodies – motivated for a variety of reasons to disprove a link between vaccines and autism, it is unlikely that any individual scientist or pediatrician would be willing to stake his or her reputation, and potentially even license to practice medicine, on publishing (or even conducting) a study indicating greater-than-reported side effects of vaccines. Not only would funding for such a study be difficult to obtain, but any flaws in its methodology would be far more heavily scrutinized than if the study confirmed what has already been promoted as scientific truth.

If so many conflicts of interest are at work, shouldn't we expect to see weaknesses in the pro-vaccine studies? In fact, on closer examination, many of the studies showing that vaccines are unrelated to autism have significant methodological flaws, or are reported as having broader conclusions than they really do. To take a recent example, an epidemiological study by researchers from the University of Louisville School of Medicine, published in the journal Pediatrics on May 24th of this year, concluded that giving children vaccines on schedule had no negative effect on long-term neurodevelopment. Most news outlets reported that the study had shattered the "myth" that a delayed or alternative vaccine schedule was safer than the standard, CDC-recommended schedule. However, the study was based on data from a 2007 study published in the New England Journal of Medicine that was intended to determine whether increased amounts of thiomersal in vaccines caused greater numbers of neuropsychological disorders. That study contained a disclaimer noting that children with autism spectrum disorders had been specifically excluded from the data set. Consequently, such children were not examined in the recent study either, and the authors acknowledged that they were restricted in their ability to assess outcomes such as neurodevelopmental delay, autism, and autoimmune disorders. The differences between the two groups being compared were also not substantial: those in the "timely" group received the recommended 10 vaccines in their first seven months, while the "untimely" group received an average of 8. The untimely group, though their shots were delayed, did not actually receive fewer vaccines at each doctor visit, and the study indicates that they may have missed vaccines for socioeconomic reasons rather than from intentionally following a different schedule. Finally, the study covered only children receiving shots from 1993 to 1997, the period just before the number of vaccine shots increased dramatically.

As stated above, these types of omissions and flaws are characteristic of most of the pro-vaccine studies. But the discrediting of Dr. Wakefield's study is not necessarily comforting for those wanting reassurance about the safety of vaccines, either: it means that his conflicts of interest, and an error-filled study, somehow escaped the notice of both the editors of The Lancet and the dozen co-authors who participated in the research. We cannot simply take at face value the results of scientific studies from even the best medical journals, having seen what happens when they are subjected to intense scrutiny. And, above all, we must keep in mind that such scrutiny is unlikely to be applied to studies that confirm the scientific consensus on vaccines.

To better determine whether a connection might exist between vaccines and autism, we would need a long-term study comparing the health problems of a control group of completely unvaccinated infants against a group following the standard vaccine schedule, and possibly additional groups following selective or alternative schedules. No study of this type has yet been done. Pro-vaccine groups argue that such a study would be unethical, assuming ahead of time that vaccines are safer than the alternative, though that is precisely what the study would be meant to determine. Though such a study would be expensive, anti-vaccine groups might be able to fund it, were it not for the fact that, having staked their reputations on a link between vaccines and autism, they could not be considered an objective sponsor. Perhaps the main obstacle, however, is that a study of this type would require a large number of children to go unvaccinated, and thus potentially susceptible to disease, and no public or private institution would want to take responsibility and liability for the potential adverse effects. Of course, autism is itself an epidemic that must be addressed, but as long as its cause remains unknown, no institution is officially liable for it. Only the families of autistic children bear its burden.

As the controversy rages on, fewer parents are taking the medical establishment (including the CDC) at its word. On May 5th, 2010, the CDC announced the results of a study it had conducted on parental compliance with the current recommended vaccine schedule. The percentage of parents who refused or delayed at least one vaccine for their children had increased from 22% in 2003 to 39% in 2008. Why? The parents cited concerns about the safety of vaccines, particularly the risk of autism. If the risks of vaccines are in fact much greater than reported, these parents are making the right choice. But one must not forget the reason we vaccinate in the first place: to protect our children from infectious diseases. Eliminating one possible cause of autism from your child's life does no good if he or she suffers permanent damage or death from polio, measles, diphtheria, tetanus or meningitis. Suspecting that the side effects of vaccines may be greater than reported therefore leaves us with no easy decision. The overarching question that remains is the one that has pursued us throughout human history: how do we safely protect our children from disease?

We'll take a stab at answering that question in next week's newsletter, "Building Immunity."

Vaccination: An Overview (Parts 1 and 2)

1. How Vaccines Work

We live in a world permeated by microorganisms of all kinds – bacteria, fungi, even microscopic animals and plants. Microorganisms interact with human beings in a number of different ways, in many cases seeking us out as their hosts for mutual benefit. Probiotics, for example, are various species of bacteria that live in our intestines, helping us digest our food and absorb nutrients. But some viral and bacterial microorganisms, known as pathogens or germs, cause disease and death in their human hosts rather than coexisting in a mutually beneficial relationship. Vaccination is meant to be a way of protecting us from these pathogens.

Generally speaking, a vaccine is a biological solution, prepared in a laboratory, that contains a weakened or killed virus or bacterium. A person who receives a dose of a vaccine containing a given microorganism ideally becomes immune to the disease caused by that microorganism. For example, the measles vaccine grants immunity to the measles virus and thereby to the disease the virus causes. The vaccine accomplishes this by taking advantage of the amazing immune system that exists in the human body.

The immune system is a network of biological processes that combine to protect us from infectious agents such as the pathogens mentioned above. Components of the immune system include physical barriers like skin and mucus but also interior protective agents such as white blood cells and interferons (proteins that protect us from viruses). Our most complex and advanced form of immunity, known as adaptive immunity, involves antibodies (aka immunoglobulins). Antibodies are specific proteins that the immune system produces upon encountering a foreign substance such as a microbe (aka an antigen). An antibody enables the body to more quickly recognize and neutralize the antigen to which it corresponds. As a result, after just one encounter with a pathogen, we can become permanently immune to it upon any future encounters. In other words, due to our ability to produce antibodies, we are able to adapt to an attack such that the same attack won’t work on us twice.

When we are injected with a dose of a vaccine containing a weakened or killed virus or bacteria, the immune system kicks into gear and fights off the pathogen, at the same time producing antibodies against it. Ideally, the pathogen will be weak enough to pose no danger to the body, but strong enough to still stimulate antibody formation. That way, if we encounter the pathogen in the future, we’ll have the antibodies ready to fight it off regardless of its strength. In other words, we’ll be immune to it.

Most vaccines contain, in addition to the pathogen, the following ingredients: animal or human tissues, which serve as a medium in which the pathogen can be cultured; a preservative (such as thiomersal, a mercury-containing compound, or formaldehyde) to keep other pathogens from contaminating the vaccine; a stabilizer, such as MSG, to prevent the vaccine from being damaged by heat, light, acidity or humidity; and an "adjuvant," usually aluminum, a substance that increases the response of the immune system. These ingredients, which differ from vaccine to vaccine, are the result of many decades of research on how to make vaccines safe, effective, and affordable.

The most crucial balance to strike in making a vaccine is between a too-strong pathogen and a too-weak one. In the former case, the pathogen may overwhelm the recipient's immune system, resulting in disease; in the latter, it may not stimulate lasting immunity. For example, the oral polio vaccine, which used a live polio virus administered by the same route through which the virus is naturally contracted, actually caused polio and subsequent paralysis in a small number of children each year before it was discontinued in the early 2000s. For this reason many vaccines are injected, bypassing the mucous membranes through which pathogens normally enter the body, and feature weakened or killed pathogens, relying partly on the aforementioned adjuvants for additional stimulation of the immune system. However, this method, presumably because it bypasses certain aspects of the immune system, sometimes does not result in lasting antibody production, in which case it does not confer permanent, lifelong immunity (hence the need for recurrent "booster shots" of certain vaccines). In contrast, immunity from a naturally contracted infection is more likely to be permanent, but the risk of serious disease is much greater when acquiring immunity that way. This dilemma of safety versus effectiveness, of stimulating immunity without harming the patient, has been present since the earliest and most rudimentary attempts at vaccination.

2. The History of Vaccination

Observing the progress of the Plague of Athens in 430 BC, the Greek historian Thucydides wrote that the plague (now thought to be typhus) "never took any man the second time so as to be mortal." Those who got sick but survived did not have to fear dying from the disease later on. Similar observations of adaptive immunity may have been what led seventh-century Buddhist monks to adopt the practice of drinking small amounts of snake venom to make themselves immune to the poison from an actual bite. In ancient China, the most threatening disease was smallpox, and by the 10th century one Buddhist nun had found a method of preventing smallpox through inoculation. Inoculation, a more general term than vaccination, is the placement of something into a medium in which it can grow and reproduce, whether a plant part grafted onto another plant or an antigen introduced into a human body. Inoculation with smallpox for immunization purposes is known as variolation. Over the next few centuries, variolation became common practice in China as a means of providing some protection against smallpox.

Ancient Chinese methods of variolation generally consisted of drying and pulverizing smallpox scabs from people with mild cases of the disease and blowing the scab powder into the nostrils of healthy people. Mild cases were chosen for the same reason that vaccine makers now often use weakened or killed pathogens: to reduce the risk of inducing a serious infection. Another form of variolation was to have healthy children wear the undergarments of infected children for several days, a tactic similar to the chickenpox playdates of the 20th century, prior to the invention of the chickenpox vaccine.

Similar forms of variolation were eventually practiced in India, Byzantium and the Middle East. Through the Crusades, the slave trade, and other channels of trade and conquest, smallpox spread to Europe and the Americas, and variolation followed. Variolation techniques now included applying smallpox scab powder to cuts or scratches on the skin, and the process was slowly accepted in the West as a preventative against the disease, though many distrusted it because of its Eastern origin. The major drawback of variolation, however, was that people occasionally developed serious cases of smallpox from the procedure, and either died or suffered scarring and blindness. People sometimes feared the preventative almost as much as the disease itself.

In the eighteenth century, smallpox was widespread throughout England, but one group of people was curiously unaffected by the disease: dairy workers. Through their contact with cows, dairy workers typically became infected with cowpox, a disease similar to smallpox but much less dangerous, spread by touch from the infected udders of cows to humans. Cowpox was similar enough to smallpox that the antibodies produced by infected workers could fight off smallpox microbes as well as cowpox microbes. One of the first people to take advantage of this phenomenon to deliberately induce immunity was an English dairy farmer, Benjamin Jesty. In 1774, during a local smallpox epidemic, Jesty infected his family with the cowpox virus that had already infected his servants and workers. The family easily recovered from cowpox and were untouched by smallpox.

Other farmers carried out similar experiments with success. Eventually, word of this immunization method reached the surgeon and scientist Edward Jenner, who in 1796 decided to test it by inoculating his gardener's eight-year-old son with pus from a milkmaid's cowpox blisters, and then deliberately injecting him with smallpox (scientists had a little more leeway to experiment freely back then). Since the smallpox virus did not appear to affect the boy, Jenner announced that he had been successfully "vaccinated," deriving the term from vacca, Latin for "cow." Jenner continued to test vaccination on dozens of additional subjects with immediate success, and thanks to his connections in scientific and government circles, he was able to publicize his findings widely. He also founded an institution to promote his method, and the British government soon banned variolation in favor of vaccination.

Over the course of the 19th century, vaccination against smallpox became standard practice in most European countries, and was in some cases mandatory. However, smallpox epidemics continued, particularly during times of stress and upheaval. During the Franco-Prussian War of 1870-71, a smallpox epidemic struck France and Germany and killed over 100,000 people. Jenner himself had become aware that both the safety and the effectiveness of the smallpox vaccine were less than ideal. He discovered that a significant number of people still developed smallpox even after vaccination, and that recipients sometimes became infected with other diseases that had contaminated the vaccine. As for the immunity conferred by vaccination, it generally lasted only 3-5 years before beginning to decline.

What Jenner did not know was the nature of smallpox and how it was transmitted. Only by the end of the 19th century did scientists investigating both smallpox and the many other infectious diseases prevalent at the time (tuberculosis, diphtheria, cholera and typhus, among others) arrive at the famous germ theory of disease. The germ theory stated that each infectious disease was caused by an individual, microscopic, living organism. The noted French chemist Louis Pasteur was a major contributor to the theory, having proven that microscopic organisms, good and bad, do not generate spontaneously but reproduce by subsisting on nutrients, and can be airborne or anaerobic. Pasteur subsequently put his discoveries to use in developing pasteurization, the method of heating liquids to kill most of the microorganisms present within them.

The germ theory of disease enabled scientists to more easily develop vaccines against infectious diseases besides smallpox. Pasteur himself worked on vaccines against rabies and anthrax. Aided by his expertise in microbiology, he discovered methods for attenuating (weakening) bacteria in vaccines so that the vaccines could confer immunity with less risk of actually causing disease.  In the following decades, scientists further refined and improved the techniques of vaccine development, introducing vaccines for diphtheria, tetanus, and whooping cough prior to World War II. A polio vaccine was developed during the early 1950s. Since then, vaccines have been developed for many other infectious diseases: measles, mumps, rubella, hepatitis A and B, meningitis, chickenpox, flu and most recently HPV and rotavirus. Today, each disease against which we routinely vaccinate has a small or nonexistent incidence in the developed world. If the 19th century was the Age of Infectious Disease, the 20th century was the Age of the Vaccine.