The USDA’s MyPlate Eating Guide

On June 2, 2011, the USDA, in conjunction with First Lady Michelle Obama, released MyPlate, which replaced MyPyramid as their guide for how Americans should eat. For decades, the government has been trying to consolidate nutrition advice from health professionals and pass it on to Americans in a clear, easy-to-follow format, especially as our obesity, heart disease, and diabetes rates have increased over the same period. However, in giving advice, the USDA has also been careful not to warn people too strongly away from the processed food produced by the American food industry, which is both a major part of the economy and a major reason why Americans are so sick. As a result of these conflicting obligations, the government’s advice is often contradictory and confusing, and the MyPlate guide is no exception (though it is marginally better than its predecessors).

The plate that now replaces the pyramid as the icon of how to balance our diets consists of four approximately equal sections, one each for fruits, vegetables, grains and “protein.” There is also a separate cup beside the plate, labeled dairy. The USDA has boiled down its directives to the following simple messages: 1. Enjoy your food, but eat less; 2. Avoid oversized portions; 3. Make half your plate fruits and vegetables; 4. Make at least half your grains whole grains; 5. Switch to fat-free or low-fat (1%) milk; 6. Compare sodium in foods like soup, bread and frozen meals – and choose the foods with lower numbers; and 7. Drink water instead of sugary drinks.

Will MyPlate actually help stem the obesity epidemic? According to Michelle Obama, quoted in the USDA Press Release, “As long as [our kids’ plates are] half full of fruits and vegetables, and paired with lean proteins, whole grains, and low-fat dairy, we’re golden.  That’s how easy it is.” Unfortunately, MyPlate will probably not have much of an effect, despite the fact that some very good advice can be found among the USDA’s recommendations. A major reason for this is that MyPlate, in addition to offering good advice, also offers some bad advice, and fails to offer any advice at all in some crucial areas. As a result, those who try to follow it conscientiously will find themselves feeling hungry and craving junk food after meals, and those who follow it less conscientiously will find plenty of wiggle room within the guidelines for including lots of processed food.

For example, MyPlate instructs us to make at least half of our grains whole grains. Because whole grains are naturally more dense and fibrous than refined grains, and have a more complex flavor, they need to be paired with a healthy fat, like cold-pressed olive oil or butter from grass-fed cows, to be appetizing. Combining whole grains and healthy fat also helps us feel full exactly at the point when we’ve eaten the right number of calories. But in MyPlate, fat is either frowned upon or relegated to the background. In following MyPlate, people will try to eat whole grains with little or no fat, and will find them unpalatable. They will therefore gravitate towards the refined grains, which they are permitted to eat an astounding 50% of the time. What’s ironic is that refined grains, not fats, are what cause us to gain weight: while both are high in calories, fat makes us feel full, but refined carbs leave us constantly hungry. Thus MyPlate’s bad advice (reduce fat) nullifies our ability to follow its good advice (eat more whole grains). What about the other recommendations? Let’s take a look at them one by one.

1.      Enjoy your food, but eat less. The first part of this message is good – food is meant to be a source of pleasure as well as nutrition and sustenance. The second part, however, propagates the misconception that we need to cut down on the foods we enjoy in order to be healthy. In fact, the most enjoyable foods also happen to be the healthiest, and when we eat these foods, we feel full right at the point when we’ve had enough. Only when we’re following a flawed plan like MyPlate do we need to worry about “eating less.”

2.      Avoid oversized portions. Again, when we’re eating a healthy diet, we can let our cravings dictate how big a portion we need. Sometimes it will be large, sometimes it will be small, but it will always reflect what our body needs at that moment. MyPlate’s vague, one-size-fits-all statement doesn’t offer any concrete guidance.

3.      Make half your plate fruits and vegetables. Most people don’t eat enough fruits and vegetables, so this advice points in the right direction. But fruit shouldn’t usually be eaten with other foods – it’s better digested when eaten alone, as a snack or dessert. Though a one-size-fits-all approach still has flaws, a better general recommendation would be making your plate 1/3 vegetables, 1/3 whole grains, and 1/3 meat, eggs, or beans.

4.      Make at least half your grains whole grains. Again, this recommendation should really be “Make all your grains whole grains.” If whole grains are better (and they are), we should eat them all the time; there’s no need for refined grains. The USDA doesn’t want to acknowledge this explicitly because so many food products are made with refined grains.

5.      Switch to fat-free or low-fat (1%) milk. First of all, fat makes you full – it doesn’t make you fat – so following this recommendation won’t help reduce obesity. Saturated fat, the type in milk, was at one time linked to heart disease, but it’s since been discovered that the real culprit is the absence of another type of fat, omega-3 fatty acids, which are lacking in the milk and meat of factory-farmed cows. This recommendation should therefore be “Switch to whole unpasteurized milk from grass-fed cows raised on small family farms.” Finally, milk is not an essential food; the USDA implies that it is in order to satisfy the dairy industry. However, the nutrients it contains can be found in other foods, such as beans, eggs, and green leafy vegetables.

6.      Compare sodium in foods like soup, bread and frozen meals – and choose the foods with lower numbers.  While this is good advice if you’re going to be buying pre-made soup, bread, and frozen meals, the best way to get a healthy amount of sodium, and what the USDA should recommend, is to make your own soup, bread, and meals, adding salt until it tastes right to you.

7.      Drink water instead of sugary drinks. Hooray! The USDA got one right – sort of. People often crave sugary drinks because their diets are already imbalanced – and following MyPlate’s recommendations won’t take away that imbalance. Most people will simply not be capable of following this advice, especially if they are eating the MyPlate way. So the soda manufacturers don’t have much to fear.


As a response to MyPlate, I’ve created the following alternative simple five-step eating plan, which I think would vastly improve the health of all people who adopted it, and which contains recommendations that complement one another:

1.      Eat whole foods, or foods with whole-food ingredients. For example, tomatoes, or tomato sauce containing tomatoes, garlic, herbs, and olive oil, but no sugar. With each meal, try to get some foods from each of the following nutrient categories:

a.      Complex Carbohydrates: Grains such as brown rice, whole wheat or whole wheat flour, quinoa, barley, oats, corn, and buckwheat; Starchy vegetables such as potatoes, sweet potatoes, and squashes; Fruits such as apples, pears, melons, bananas, plums, mangoes, oranges, and grapefruit; Natural sweeteners such as maple syrup, agave nectar, brown rice syrup, barley malt, and raw honey.

b.      Protein: Animal products such as beef, poultry, lamb, pork, milk, cheese, fish, and eggs; Beans and bean products such as lentils, black beans, kidney beans, navy beans, chickpeas, tofu, and tempeh; Nuts and seeds such as almonds, peanuts, walnuts, sunflower seeds, pumpkin seeds, and cashews.

c.      Fats: Vegetable oils such as olive oil, sesame oil, coconut oil and corn oil; Animal fats such as butter and lard.

d.      Vitamins & Minerals: Vegetables such as greens (kale, collards, chard, bok choy, spinach, etc.), roots (beets, carrots, radishes, turnips, parsnips, etc.), bulbs (onions, garlic, celery, scallions, etc.), nightshades (peppers, tomatoes, eggplant), gourds (cucumber, summer squash), and many more; Fruits such as berries, lemons, and limes; Herbs and Spices such as basil, oregano, thyme, rosemary, sage, garlic, pepper, cumin, cardamom, cinnamon, cloves, ginger, etc.

e.      Microorganisms: Raw fermented foods such as yogurt, kombucha, raw sauerkraut, kimchee, miso, and kefir.

2.      On average, eat about 1/3 carbs, 1/3 protein, and 1/3 vegetables with meals. The right way to balance a diet differs from person to person based on body type, activity levels, climate and environment, season, gender, age, and so on. If you’re eating healthy foods, then by listening to your cravings, you can figure out what balance is right for you at any given time. Some other notes: Fruit, especially raw, doesn’t digest well with most other foods, so it should be eaten as a snack or a dessert. Fats, such as olive oil or butter, can be obtained separately or from eating the whole food in which they are originally found (olives and milk, in this case). Herbs, spices, and sweeteners should all be used in small amounts, to flavor foods. As seen above, some foods contain more than one type of nutrient and can meet more than one requirement at once.

3.      Use good quality ingredients. All plant foods – grains, fruits, vegetables, herbs, spices, vegetable oils, etc. – should be organic when possible. Animal products should come from animals raised on their natural feed, and if that feed is organic, even better. Fruits and vegetables should be fresh and in season; if locally grown, even better. Vegetable oils should be cold-pressed and packed in a dark bottle, in addition to being organic. Milk is best unpasteurized, if from healthy cows; if pasteurized, choose organic and grass-fed. Even if organic, foods such as beans or tomato sauce are better cooked from scratch than from a can.

4.      Eat home-cooked food at least 80% of the time; when eating out, choose restaurants that also follow the recommendations above. When eating home-cooked food, you can fully control the quality of the ingredients and the balance of the meal, and adjust the flavors and proportions to suit your body’s individual needs.

5.      If the above steps are followed, eat in accordance with your cravings. We’re taught that food cravings are to be resisted. But when we’re eating a healthy and balanced diet, our bodies naturally start to crave those healthy foods, and crave them in just the right quantity and proportion. As a result, once you’ve introduced your body to the healthy way of eating described above, you don’t need to worry about calorie counts, portion sizes, or how much fat or carbs you are getting – you can listen to your cravings, and they will guide you towards foods that will help you achieve your natural weight.


Such an eating plan, if many people followed it, would result in major upheaval in the American – and world – economy. Not only is our food system global, but people in the rest of the world look to Americans as their example for how to eat. The sugar, dairy, meat, and processed-food industries would never stand for it, for it would mean the almost total abandonment of their products in favor of a more local, farm-to-table economy.

This brings us back to the conflicts inherent in MyPlate. Programs like MyPlate and MyPyramid have tremendous influence, not in getting people to be healthier, but in giving them a flawed conception of what is healthy and what is not, and actually reinforcing their consumption of processed food by giving them an unappetizing alternative. Instead of trying to give people advice that is skewed by vested interests, the government should eliminate the subsidies that make processed food artificially cheap, so that it’s easier for us to make our own choices to eat healthier. While MyPlate is better than previous guides, it’s inherently insincere, committed as it is to the status quo rather than to a new food economy that actually supports our health.

Preventing Pneumonia in Children

This past winter and spring, I kept hearing about how children in the families I knew were coming down with pneumonia. In fact, pneumonia, which is a condition of inflammation and fluid buildup in the lungs associated with infection, is fairly common in our country; it affects 5.6 million people per year and is the 6th most common cause of death. In the winter and spring, when our immune systems are weakened by the cold and then overloaded by pollen, respiratory issues are at their worst. Fortunately, since pneumonia in children is usually caused by a bacterial infection, it can be treated with antibiotics.  Unfortunately, for increasing numbers of children, pneumonia is not an isolated event, but the consequence of chronic respiratory illness, such as asthma. This type of chronic illness is a condition to which antibiotics actually contribute, due to their overall weakening of the immune system.

Although I criticized antibiotics in last week’s newsletter, what I’m really criticizing in both cases is their overuse. Pneumonia is the leading cause of death in children in poor countries; here, thanks to antibiotics and other tools of modern medicine, it rarely kills children. But while antibiotics may prevent serious illness and death, they don’t establish general health. Just prescribing them over and over again won’t prevent future illness. The real question is why our children are getting sick in the first place. After all, it’s easy to identify the causes of respiratory illness in poor countries: children there are malnourished because they can’t access the food they need, and they are more likely to be exposed to lung-damaging pollution from poisons that accumulate in the environment due to unregulated industry or ongoing war. But by and large, we don’t have those concerns. So why do American children get sick as much as they do?

The reason lies in the fact that, in many ways, our children are the extreme opposites of their third-world counterparts. Instead of suffering from too few calories and too little protein, they get too much of both, in the form of sugar, white flour, factory-farmed meat, and rancid or hydrogenated oils. This excess of protein, calories, and toxins is more than the liver and kidneys can handle, leading to a backlog that accumulates in the lungs and sinuses as phlegm and mucus, and causes the symptoms of wheezing and shortness of breath that are seen in asthma. This backlog also makes it harder for the immune system to deal effectively with respiratory infections. In fact, pathogens are far more likely to take root in a person with a congested, stagnant condition, as congested lungs are just the type of environment in which they thrive. In addition, as a side effect of our overzealous attempts to create a clean environment for our children, we rely on excessively strong cleaners and antibacterial products, which leave our children’s immune systems under-developed. The result is that an infection their bodies should be able to zap without a second thought ends up making them sick.

Once children develop chronic respiratory issues, they are often given prescription medicine to take indefinitely – usually some type of anti-inflammatory steroid, the purpose of which is essentially to eliminate the symptoms of wheezing, mucus production, and so forth. However, these symptoms, as noted above, are the consequence of the body attempting to rid itself of the byproducts of excessive processed food or allergens. Shutting down the body’s systems works in the short term, but leaves the child vulnerable to more serious illness – such as pneumonia. In addition, such medications have negative side effects, including cramps, sore throat, lightheadedness, dry mouth, upset stomach, and even behavioral changes.

The best long-term approach is to put your child on a healthier diet, one that provides the appropriate amount of calories and nutrition. Particularly important to emphasize in the beginning are foods that specifically heal and protect the lungs. Here are some examples broken down according to the different types of foods that help:

Pungent Foods: These foods help to break up and flush out the mucus in clogged lungs and sinuses. Examples are onions, garlic, radishes, horseradish, white peppercorns, turnips and chili peppers.

Cleansing Foods: Green leafy vegetables contain nutrients that help the lungs eliminate toxic residue. Interestingly, the stalks of many leafy greens, such as kale, collards, mustard greens, and Swiss chard, somewhat resemble the lungs.

Immune Boosters: Golden-orange vegetables contain beta-carotene, which helps protect the mucous membranes of the lungs. Examples are carrots, winter squashes, pumpkins, turnips, and rutabagas.

Fermented Foods: Raw fermented foods contain active bacteria and enzymes that aid digestion and detoxification and help the immune system fight off pathogenic bacteria. Finding one that your child likes and serving a small amount each day will go a long way towards improving his or her health. Examples are sugar-free yogurt, raw sauerkraut or kimchee, miso, kombucha, and kefir.

The following foods also aid the lungs in various ways: brown rice, barley, millet, oats, cauliflower, lotus root, celery, white fish, and herbs and spices such as dill, fennel, coriander, basil, bay leaves, cardamom and licorice. One particularly powerful natural medicine is oil of oregano, which has antiseptic and antibacterial properties but does not weaken the immune system. It can be taken internally or inhaled via vapor steaming. Finally, simply breathing deeply on a regular basis helps to heal the lungs.  Shallow breathing results in reduced oxygen, which decreases the capacity of all the body’s systems.

Adding foods is more important than removing them. However, it’s good to know which foods can make respiratory conditions worse. The main culprits are pasteurized dairy products, white flour, sugar, and hydrogenated oils, so they should be eliminated or replaced with their healthier counterparts as appropriate. However, even the healthier versions of these foods – raw dairy from grass-fed cows, whole wheat flour, natural sweeteners, and naturally processed oils – may need to be given in more limited quantities until the child is free of symptoms even when not on medication, as these foods are by nature heavier and more congesting.

If your child has pneumonia, he or she obviously needs immediate medical attention. But if your child has frequent colds or chronic respiratory issues, which may occasionally worsen into pneumonia, you can use the foods and remedies listed above to change the course of your child’s health, help them to detoxify fully, and give them the nutrients they need for strong, healthy lungs.

Raw Milk, the FDA, and the E. coli Outbreak

On April 19th, the U.S. Food and Drug Administration filed a complaint against Pennsylvania Amish dairy farmer Dan Allgyer, alleging that he had violated federal law by delivering raw milk across state lines. The milk was being purchased on a regular basis by a cooperative of buyers in Maryland, a state that has outlawed the sale of raw milk within its own borders. It is legal to sell raw milk in Pennsylvania, but a violation of interstate commerce laws to deliver it to buyers in other states. According to Dara A. Corrigan, the FDA’s associate commissioner for regulatory affairs, “Drinking raw milk is dangerous and [it] shouldn’t be consumed under any circumstances…[the] FDA has warned the defendant on multiple occasions that introducing raw milk into interstate commerce is in violation of Federal law.”

However, despite their claims about the danger of drinking raw milk, the FDA could not point to any cases of foodborne illness arising from the consumption of Allgyer’s milk. In fact, raw milk in Pennsylvania is already highly regulated.  More than 110 farms in Pennsylvania have raw milk permits that are only maintained via regular and rigorous testing for the kinds of bacteria that cause foodborne illness. As Adam Helfer of the Washington Times pointed out,


“The confusion seems to arise from the FDA not understanding and differentiating between conventional milk (which needs to be pasteurized for safety) and raw milk from healthy, pastured animals and clean conditions. It is to be noted that grass-fed raw milk has been consumed safely by cultures for thousands of years.”


Cows fed on grass, their natural food, and raised in their traditional environment (open pasture) with plenty of space to graze, are consistently healthy, unlike their factory-farmed, grain-fed counterparts. As a result, their milk is not only more nutritious, but contains significant quantities of beneficial bacteria and enzymes, which protect the milk from pathogenic bacteria. Instead of putrefying, raw milk from healthy cows simply sours over time, as the beneficial bacteria proliferate, and ultimately turns into buttermilk, yogurt and cheese. While raw milk from factory farmed cows would be very risky to drink, raw milk from healthy cows is practically impossible to contaminate, and does not need to be pasteurized.

It all comes down to the question of whether the cows are healthy and closely monitored – standards that are easily achieved on a small family farm like Dan Allgyer’s. If these conditions are met, raw milk is vastly superior to pasteurized (whether organic or factory-farmed – though organic pasteurized is superior to factory-farmed pasteurized), both in terms of nutrition and taste, which is why growing numbers of people are choosing to purchase it. The FDA, in ignoring this distinction, treats all raw milk as equally dangerous, regardless of the cow it came from. Consequently, the FDA considers it necessary to take away our freedom to purchase raw milk.

In an attempt to be generous to the FDA, one could say that they are being busybodies only out of a sincere desire to protect our health. They may be trying to control what we can eat and drink, but at least it is with our best interests at heart. However, not only does the FDA permit the sale of cigarettes and alcohol – both of which, if consumed too frequently, are actual health hazards, unlike raw milk from grass-fed cows – the FDA even overlooks dangers it has itself identified in pasteurized milk.

In January of this year, the New York Times reported that the FDA, each year, finds illegal levels of antibiotics in older dairy cows destined for the slaughterhouse. The big dairy companies regularly dose their cows with antibiotics because the factory-farm conditions in which these cows live are so unhealthy – and the cows’ diets are so poor – that they are sick almost every day of their lives. Since it stood to reason that these dangerously high levels of antibiotics might be present in dairy cows while they were still producing milk for human consumption, the FDA was considering testing the milk from the large dairy farms that had supplied the antibiotic-laden slaughterhouse cows.

However, the FDA’s proposal met with strong resistance from the pasteurized dairy industry. Why? Ostensibly because the testing would take long enough that milk from the cows being tested would have to be put on the market in the meantime. And if the milk turned out to be contaminated with antibiotics, it would then have to be recalled, costing dairy producers millions and harming their reputations. To quote from the New York Times article,


“What has been served up, up to this point, by Food and Drug has been potentially very damaging to innocent dairy farmers,” said John J. Wilson, a senior vice president for Dairy Farmers of America, the nation’s largest dairy cooperative. He said that the nation’s milk was safe and that there was little reason to think that the slaughterhouse findings would be replicated in tests of the milk supply.


The danger to us, of course, is that by consuming too many antibiotics in milk, we could not only weaken our own immune systems but also further the evolution of drug-resistant strains of bacteria. In fact, one impetus for the new testing is that the antibiotics for which the FDA currently tests are no longer the only ones in use by dairy farmers. Why are so many new antibiotics being used? Because the most common ones are losing their effectiveness as those drug-resistant strains of bacteria develop. If there was ever a situation for the FDA to step in, this was it. Unfortunately, all the dairy industry had to do was send a “sharply worded letter” to the FDA to get them to withdraw their testing plan for indefinite review. That means anyone who is drinking conventional pasteurized milk may be drinking a product too dangerous to be on store shelves.

So is the FDA really looking out for us? Or are they just looking for easy targets? It seems that any segment of the food industry that is large and influential is safe from oversight, while a single Amish farmer working hard to provide the highest quality of milk to his small group of buyers is Public Enemy #1. Perhaps the real reason he is under scrutiny is that his business is a threat to the big dairy industry.

What does the future hold? Marylanders, and residents of other states who would like to choose raw milk, may have fewer and fewer options. Congressman Ron Paul has introduced a bill, HR 1830, that would allow the shipment and distribution of unpasteurized milk and milk products for human consumption across state lines; however, the bill is unlikely to pass. Conventional pasteurized milk will continue to dominate the market for the foreseeable future, and because it achieves its artificially low prices through unnatural factory farming (helped along by government subsidies on the grains and soybeans fed to the cows), the industry will continue to pump its animals full of antibiotics just to keep them alive. Those antibiotics will also continue to give rise to new strains of drug-resistant pathogens. There’s one in particular that you might be reading about in the news lately: E. coli O104:H4.

E. coli is a bacterium commonly found in the intestines of warm-blooded animals.  Most strains of E. coli are harmless, even beneficial, contributing to the flora of the gut, but a few (such as serotype O157:H7) produce shiga toxin, which causes hemorrhagic diarrhea and kidney failure.  The current outbreak in Germany is being caused by shiga-toxin producing strain O104:H4 – a new strain that resists more than a dozen commonly used antibiotics, making illnesses caused by it extremely difficult to treat. In the span of a month, it sickened over 3,000 people and killed 36.  European public health officials, desperately seeking the immediate source of the bacteria, first incorrectly guessed it was cucumbers and other raw vegetables imported from Spain; now they are fairly confident it was sprouts from an organic farm in northern Germany. How E. coli O104:H4 got into the sprouts in the first place has not yet been determined.  But outbreaks of shiga-toxin producing E. coli (STECs for short), which have been in existence for less than thirty years, almost always have their ultimate origin in cattle.

A recent article in Bloomberg News quotes Australian veterinary public health researcher Rowland Cobbold as saying that “Cattle are the main reservoir for E. coli, the family of bowel-dwelling bacteria from which the new bug comes…The cucumber [or other raw vegetable] may be the lead back to the original ruminant that was the source…It’s almost entirely likely that it came from cattle at some point.” From the article:


Outbreaks of bloody diarrhea caused by E. coli have usually been linked to contaminated meat, Cobbold said. In more recent outbreaks where fruit and vegetables were implicated, E. coli-contaminated manure or irrigation water were found to be the original source, he said.

“If this goes the same way as previous investigations, they’ll find the ‘smoking gun’ — the ‘smoking tomato’ or the ‘smoking cucumber’,” Cobbold said. “They will then follow the production source back to the farm and they’ll work out the various contamination routes.” Most likely that will lead to the “smoking cow,” or at least a specific herd where the strain can be found, he said.


In a slaughterhouse, intestinal contents or fecal matter from a cow’s hide can mix with the meat going into ground beef. E. coli in manure can also spread into nearby fields and water sources and thereby get into vegetables. As a consequence of the latest outbreak, many are now calling for irradiation of our entire food supply (essentially, pasteurization of cucumbers and lettuce), another solution that would enable food producers to skip quality assurance, and one likely to give rise to new types of health crises, just as the current system of industrial agriculture has done.

The capacity for pathogenic bacteria to spread continent-wide from a single farm or animal is one of the flaws of our global food system. But the real issue is what gives rise to a bacterium like E. coli O104:H4 in the first place: over-usage of antibiotics, the very issue which the FDA, despite its stated mission to protect our health, is hesitating to address. While it’s frightening that similar outbreaks in the future are almost inevitable, it’s deeply ironic that the FDA is busy attacking the very type of farm that, by raising healthy cows in a natural, small, easily monitored environment, and selling its products directly to its local customers, is designed to prevent such global catastrophes.

We are not in any danger from farms like Dan Allgyer’s. But we have reason to fear that factory farming, with its unhealthy cows full of antibiotics, will lead to new strains of E. coli that have the potential to repeatedly contaminate our global food system. If the FDA were serious about limiting the spread of foodborne pathogens, it would go to the source of the problem – the poor diet and unhygienic living conditions of the animals who are the initial victims of industrial, factory-farmed agriculture. At the same time, it would leave in peace those who are choosing to bypass the industrial food system for a local system that’s safer, healthier, and more accountable.


My Journey with Food, Part 2

In my second year of college, I was eating primarily out of my dorm room, cooking brown rice, tempeh, and other macrobiotic foods with an electric steamer and hot pot. I was no longer depressed, though I was frequently hungry and had to eat heaping platefuls of macrobiotic foods just to feel full. My dependence on junk food was gone, but at the same time I was hesitant to branch out to any foods I wasn’t already familiar with.

Then I met Katy, the woman who would become my wife. She was a year behind me in college, but we had a common interest in swing dancing and were both considering medical school. As we spent more time together, I automatically assumed she would find my food (and me) weird. To my surprise, she didn’t think of me as an organic freak.  Though Katy grew up eating typical American food, the majority of it was home-cooked by her mom. She also had friends in her rural hometown who ate organic, home-grown whole foods, which tasted better than just about anything else she had tried.

The real difference between Katy and me was vegetarianism. I objected to eating meat, and she didn’t. It had the potential to be a serious obstacle to our relationship. But on our first dinner date at an Indian restaurant, Katy offered to let me order for her. I explained that I didn’t know much about meat. “I don’t have to have meat,” she said. “I’ll get what you like.” Not only that, but when Katy learned about my allergies, she applied her cooking and baking skill to making the meals she loved with soymilk or Earth Balance instead of milk and butter, so that I could have them too. Katy’s grace in meeting me more than halfway made me much more willing to try foods she liked that I had been picky about for years.

Katy and I both lived off campus during our last two years of college, and did all our own cooking. We gradually ate a greater variety of healthy foods and saw corresponding improvements in our health. In my senior year, I was having a conversation with a friend about diet and was blathering on about the immune system. He promptly suggested I become an immunologist after graduation. Although I had considered becoming a doctor, the idea ultimately did not appeal to me because I felt it would entail treating the symptoms of health problems, rather than the causes. My friend’s suggestion, however, made me realize that advising people on diet and lifestyle would be an effective way to promote health.

The question was where to acquire my education in nutrition. I knew enough about certified nutritionist and dietitian programs to turn them down. I didn’t want to tell people to count calories, take supplements, eat artificial sweeteners instead of sugar, and drink more milk “for strong bones.” From my own experience and from the reading I had done, I was aware that nutrition science was not always science-based, and was rarely effective in motivating people to get healthier. After a good deal of research, I located a school called the Institute for Integrative Nutrition, which taught all the major dietary theories, but had a core emphasis on whole foods, traditional diets, stress reduction, and counseling people on overcoming difficult emotional relationships with food.

At IIN, I learned that the Macrobiotic diet on which I had grown up was so effective because of its ancient roots as a traditional Japanese diet, evolving over thousands of years to meet the nutritional needs of the people who lived on it. In fact, pretty much all the solid dietary wisdom we received in school was based on what people ate traditionally, though the specifics of a good diet differed from climate to climate. What remained in common, however, was the principle of eating whole natural foods, in season, and in the right proportions.

To my surprise, I learned that for my body type, some meat in the diet might be necessary for health. Since I was in school in New York and Katy was still in college, I decided to try making a hamburger (grass-fed, free-range) myself. She warned me against it. “Just wait until I visit you, and I’ll make it for you. At least, if you have to make it, make sure it’s not gray in the middle.” I made the hamburger, and as I ate it, wondered what the big deal was all about, and why so many people loved red meat. Of course, the hamburger was gray in the middle – I had been distracted during that part of the conversation with Katy. Eventually she showed me how to cook it correctly, and I started adding more meat and fat to my diet. For the first time in my life, I felt full on a regular basis, and I noticed that the sugar cravings that had plagued me on and off throughout my life were gone.  When Katy’s mom, who had never been entirely comfortable with my vegetarianism, learned that I had gone to a hippie nutrition school and learned to eat meat, she became willing to eat kale on a regular basis (and now likes it).

At IIN, I also learned for the first time about traditional raw milk from grass-fed cows and its greater digestibility when compared with pasteurized milk. Due to my allergy history, however, it was another year until I found myself willing to try it. Since then, I’ve included raw milk in my diet on a regular basis with nary an allergic reaction. Recently, Katy mastered the art of traditional whole wheat sourdough bread, and I’ve been able to eat it as well without a problem.

Nowadays, when people ask me if I have any dietary restrictions, I say “none.”  I’ve gone from someone who always felt like the pickiest eater in the world to someone who is willing to eat anything. It’s not that I think everything is healthy, or right, to eat, but if I want to guide others in dietary matters, I have to be open to trying their food as well, just as my wife was for me on our first date.  While I’ll never be able to eat junk food like I did in college, and still be healthy, the important thing is that I don’t want to. Thanks to my education, I’ve learned how to eat a healthy, balanced diet that meets all my nutritional needs and satisfies my cravings. It’s an area of my life that is no longer a source of stress, nor is it putting me at risk for illness. And while not everyone might thrive on the exact same balance of whole foods that is suited for me, every person is capable of achieving the same type of success with diet and health. What I love about my work as a holistic health counselor is the opportunity to guide others into that place, and to see the amazing and long-lasting improvements in their health that result.


My Journey With Food, Part 1

As a holistic health counselor, I regularly give people advice on how to eat and how to develop a positive relationship with food. But my own relationship with food was once very difficult. When I was just a few years old, my parents discovered that I was strongly allergic to wheat and dairy products, and mildly allergic to citrus fruits and nuts. But instead of getting a rash or a runny nose, I would have an emotional breakdown and go into hysterics after eating these foods. Only when the foods were out of my system would I again recover my emotional balance.

Partly to avoid these allergens, my family followed the Macrobiotic diet, which was based on whole, organic foods, particularly traditional Japanese foods. As a result of the diet, we enjoyed good health and energy and rarely got sick.  However, I did have occasional sugar cravings, as well as cravings for the foods to which I was allergic. I also grew up an excessively picky eater. From childhood, I was used to brown rice, miso soup, sea vegetables and greens, and was apprehensive about trying foods outside my macrobiotic “comfort zone.” I dreaded having to eat at friends’ houses or at non-Japanese restaurants where I might be served something I didn’t like. My pickiness, combined with my allergies and my decision to be a vegetarian, meant that finding food I could or would eat was always a stressful situation for me and my family.

During my teenage years, my family stopped following the macrobiotic diet as strictly as before.  Although I wasn’t exactly a “junk food vegetarian”, I didn’t eat as many balanced meals as I had in the past. I liked to snack on rye bread with margarine, trail mix (as my nut allergy had diminished), and corn chips, and I didn’t eat many vegetables. Every once in a while, we had a big macrobiotic dinner that helped keep my health on track, but I didn’t make the connection, instead taking my good health for granted. In fact, when it came time to go off to college, I thought to myself that I would be able to get by on trail mix, energy bars and soy milk, without suffering any health problems. I didn’t even think eating the foods to which I was allergic would be such a big deal.

Unsurprisingly, the campus cafeteria had almost no appetizing vegetarian, non-dairy options. I was constantly hungry, and gravitated towards sugary foods like cookies and candy, which was embarrassing, as all my friends knew I came from a “health food” background.  But I didn’t think very seriously about the consequences of eating so much processed food, and didn’t expect anything bad would come of it. In the meantime, I enjoyed eating my junk food far better than the poorly prepared whole grains, beans, and vegetables in the cafeteria.

Everything changed, however, over the course of one Sunday in my second semester of freshman year. I was enjoying college in general and had been having a particularly good week. But in the midst of a normal conversation with my friends after breakfast, I began to feel an overwhelming sense of despair. I had no idea where it was coming from, but it got worse over the course of the day. I hoped inwardly that a good night’s sleep would banish it. I recognized this strange feeling as one that had come over me during the last few days of my first semester of college, just before I went home for a few weeks. At that time, it had not been as strong, and right after it happened I had benefited from a lot of macrobiotic home cooking. This time, however, my depression did not go away overnight, over the weekend, or even during the next week – it just got worse. There was nothing going on in my life to be depressed about, but I couldn’t shake the feeling regardless. My life – all of reality, in fact – felt empty and meaningless, and I felt terribly sad, but for no good reason. No breakups, no deaths in the family, no financial worries, no legal issues. College was hard work, but I had been relishing the challenge.

Without a macrobiotic background, I might have just chalked my inexplicable depression up to a chemical imbalance in my brain and gone to a doctor or psychiatrist for mood-altering medications. But instead I called my parents and told them what was happening. They immediately recognized the symptoms of my allergies, and I acknowledged that I had been eating lots of sugar, white flour, and dairy products in candy and baked goods, while almost completely avoiding vegetables.

My parents’ theory sounded plausible to me, though hearing it didn’t by itself take away my severe depression. I didn’t feel like taking care of myself, but nevertheless I forced myself to put the effort into eating differently in the hope that it would take away the horrible emotions I was experiencing. I bought a rice cooker and vegetable steamer to go with my electric hot pot, and started making macrobiotic lunches and dinners in my dorm room. Within the next few weeks I gradually began to feel better, but remained anxious that the improvement was only temporary. In the end it took several months of eating right and avoiding junk food entirely for my depression to fade away. In the next semester I borrowed some macrobiotic books and began teaching myself to cook some basic meals. Having seen the effects of healing foods on myself, I became fascinated with whole foods, their benefits for various health problems, their traditional usage, and how to prepare them. Mainly, I realized how much I didn’t know about nutrition and health – and how many foods I had never even tried. Despite my lifelong allergies, I had devalued and ignored healthy food, the very thing that made it possible for me to function, and I clearly had a lot to learn.


To be continued next week!


Vaccination, an Overview, Part 5: Two Approaches to Vaccination

The lack of knowledge regarding the primary cause of the epidemics of disease in the previous centuries – that is, malnutrition and exhaustion leading to weak health – can be attributed to the medical establishment’s general emphasis on the symptoms, rather than the causes, of illness. This emphasis is not a feature solely of modern medicine or even of mainstream medicine. Throughout history, both mainstream doctors and alternative health practitioners have sought to provide us with “quick fixes” and convenient solutions to the symptoms of problems initially caused by poor diet and lifestyle, rather than recommending a diet and lifestyle that would help prevent such problems in the first place.

The first flaw in this approach is that the medical interventions intended to eliminate our symptoms invariably have significant side effects of their own, which then require further interventions with additional side effects. A second flaw is that in situations where such interventions are not available, if we have not been taught how to take care of our health, we are entirely at the mercy of disease.

We can think of suffering a severe reaction to an infectious disease as a “symptom” of being in poor health in the first place. However, the medical establishment, by minimizing the role played by a healthy immune system, implies that we are all equally defenseless against pathogens. The consequences of this approach are the ever-increasing vaccine schedule and our society’s “germophobia.” In fairness, in recent decades, acknowledgement of the importance of diet and lifestyle to healthy immune function, and of the abilities of the immune system itself, has increased. But when flu season strikes, the message we hear is primarily to get vaccinated and to minimize the spreading of germs. Just as is frequently the case with diet, we try to eliminate what’s bad but we don’t try to replace it with what’s good.

In taking the vaccinate-and-sanitize approach to fighting disease, rather than an approach that promotes health, we may simply have shifted the battleground. Part 3 already discussed the real and potential debilitating side effects of the vaccines we administer to our children and ourselves.  As for sanitization, while it can eliminate potentially harmful germs from the environment, it often does so by means of toxic chemicals. And lack of exposure to germs is likely to result in oversensitive immune systems that will react negatively to pollen and commonly eaten foods.

In accordance with the second flaw of the symptom-focused approach, children are left vulnerable in instances where vaccines do not succeed in providing immunity, where they are unavailable, or where they have not been invented. And when unhealthy children enter germ-laden environments, as is inevitable given that they spend more time in the doctor’s office or hospital, they are at serious risk. Consider that in the United States an estimated 80,000 individuals per year die from infections acquired in hospitals, most frequently due to catheters – which is not surprising, given that hospital workers only wash their hands about 70% of the time. While our general environment, and our hospitals as well, are far more hygienic now than they were 150 years ago, and we are generally better nourished as well, we still have a long way to go.

As a short-term strategy, vaccinate-and-sanitize saves lives. But like most approaches that only seek to address the symptoms, not the causes, of our health problems, it inevitably results in new, mysterious health problems. At best, focusing on vaccines largely maintains the status quo. The Bill & Melinda Gates Foundation is an excellent example. The world’s largest private foundation, it possesses an endowment of $33 billion, and devotes a significant portion of its resources to improving global health. Vaccination and medication programs in third-world countries are the primary beneficiaries of these resources. However, after decades of giving, vaccine-preventable and other diseases still persist in these countries, and the individuals receiving vaccines and medications often lack basic needs such as nutrition, clean water, and transportation. If they can even access the medications, they may not have enough food to digest them. The wealth of the foundation is ultimately directed to the already wealthy pharmaceutical companies, while the residents of third-world countries remain malnourished and impoverished.

In the long run, symptom-focused strategies tend to benefit more those who promote them than those who are subject to them.

A better way to handle the threat of infectious disease would be to create the conditions for healthy, strong immune systems in children. As discussed in Part 4, these conditions include eating a diet based on whole foods (and breast milk in the case of infants); drinking clean water as the primary beverage; getting enough rest and enough exercise; and reducing stress. Like a muscle, the immune system must also be exercised in order to be strong. Natural “vaccination,” or technically, immunization, can occur when we are exposed from a young age to a wide variety of microbes in raw and fermented foods, in breast milk, even in dirt. It is a process not too different from that by which Benjamin Jesty’s dairy workers were naturally protected from smallpox. In fact, a healthy child’s natural exposure to milder infectious diseases, such as chickenpox, may be beneficial for healthy immune development. A well-nourished child with a well-exercised immune system is strongly equipped to handle the pathogens he or she is likely to encounter, and is likely to be one of the vast majority of children who do not suffer severe reactions to more serious diseases such as measles, or even polio or diphtheria. A child who, in contrast, is raised on a diet of largely processed food, has little exposure to beneficial bacteria, leads a sedentary lifestyle, and suffers frequent colds and ear infections that are treated with antibiotics instead of being fought off by the immune system, is more likely to have a severe reaction to a strong pathogen and, in the absence of a change in diet and lifestyle, would probably benefit from the protection of most vaccines, despite their potential side effects.

Unfortunately, the well-nourished child is far more rarely seen in our society than his or her conventionally nourished counterpart. The medical establishment has, for the most part, chosen neither to study nor to promote practices that make us more healthy and consequently less dependent upon vaccines, medications, supplements, sanitizers, and surgery. Most doctors acknowledge that whole foods are better than processed foods, breast milk superior to infant formula, and some exercise better than none. Some may even recommend playing in the dirt over sitting inside all day. But junk foods continue to infiltrate our schools just as recess programs disappear from them. Infant formula is pushed on women whose babies are not growing at rates arbitrarily determined to be acceptable. Many people consider it inconvenient to try to be healthy or to allow the immune system to fight disease, which is the primary reason why a chickenpox vaccine, for example, was invented. As long as our attitude towards health is symptom-focused, and values short-term convenience over long-term wellness, our health will remain vulnerable and our need for medical interventions will grow.

In the late 19th century, during the worst epidemics of disease, the author and social theorist Leo Tolstoy wrote, in an essay entitled Modern Science:  “The defenders of present-day science seem to think that the curing of one child from diphtheria, among those Russian children of whom 50% (and even 80% in the Foundling Hospitals) die as a regular thing apart from diphtheria must convince anyone of the beneficence of science in general…our life is so arranged that from bad food, excessive and harmful work, bad dwellings and clothes, or from want, not children only, but a majority of people, die before they have lived half the years that should be theirs…And, in proof of the fruitfulness of science, we are told that it cures one in a thousand of the sick, who are sick only because science has neglected its proper business.” According to Tolstoy, though science was an incredibly valuable tool, when misdirected it did not benefit humanity. To be beneficial to us, medical science must be guided by wisdom and foresight, rather than shortsightedness, and should possess a healthy respect for and inquisitiveness into the capacities of the healthy human body.

The ultimate question of whether and how to vaccinate your child is a difficult one, and the answer is not the same for everyone. There is no utterly risk-free approach to take; even the healthiest person can still succumb to a powerful pathogen, as can the most thoroughly vaccinated person. Your decision must involve an awareness of your child’s likely susceptibility to each disease against which we vaccinate, and a calculation of the benefits of the vaccine against the risks of its possible side effects. Regardless of your decision, however, the best thing you can do for your children is to make the necessary changes in your diet and lifestyle for the promotion of health. For further reading on the risks and benefits of vaccines as well as strategies for strengthening the immune system, I suggest you consult one or more of the books listed below.


Vaccinations: A Thoughtful Parent’s Guide by Aviva Jill Romm (2001). Discusses vaccines from a historical perspective and contains natural and herbal remedies for common childhood diseases as well as recommendations for building immunity naturally. Romm is a certified professional midwife, a practicing herbalist, and a physician.


What Your Doctor May Not Tell You About Children’s Vaccinations by Stephanie Cave, M.D. (2001). Explores the possible relationships between vaccines and autoimmune diseases/developmental disorders, and contains an overview of vaccines and the legal issues related to them, as well as an alternative vaccine schedule.


The Vaccine Book by Robert Sears, M.D. (2007). Contains a detailed guide to the current vaccine schedule, including a discussion of the severity and rarity of each disease and the ingredients and side effects of each vaccine. Also contains an alternative vaccine schedule.


The Vaccine Guide by Randall Neustaedter, O.M.D. (1996, 2002). Provides an extensive and technical overview of research on the safety of vaccines and the results of that research.


How to Raise a Healthy Child in Spite of Your Doctor by Robert Mendelsohn, M.D. (1984). Covers the most common childhood ailments and the appropriate treatments for them. Also contains a section on diseases commonly vaccinated against, their severity, and the effectiveness of the vaccines.


Vaccination, an Overview, Part 4: Building Immunity

As discussed in Part 1, vaccines are designed to stimulate the immune system. In fact, their effectiveness really comes from triggering the body’s own natural processes of adaptive immunity. The underlying assumption of vaccination is that the immune system is inherently unlikely to be strong enough to handle a disease when it encounters it in nature, which is why we need vaccines to safely and artificially engineer the encounter with disease. This assumption, which has its origin in the aforementioned germ theory of disease, is perhaps an understandable one. In the late 19th century, scientists had observed epidemic after epidemic of infectious disease resulting in millions of casualties. It was reasonable for them to conclude that the pathogens they had discovered were indiscriminately deadly. However, one scientist of the era, a French biologist named Antoine Bechamp, had a different explanation for why people were succumbing to infectious disease at such great rates: he pinned it on their weak health.

Bechamp, a contemporary of and influence upon Pasteur, would have agreed with Pasteur’s arguments that methods of sanitization (such as hand-washing and pasteurization) would prevent the spread of disease by eliminating pathogens from the local environment. However, Bechamp’s theory was that most people who suffered from infectious disease did so because their own bodies were, in a sense, “unsanitized,” but on a cellular level. According to Bechamp, when we are in a diminished state of health, our cells and tissues form a breeding ground for microorganisms (or microzymas, as he called them) that are largely already present in our bodies, but which do not take on a harmful form or reach harmful levels without the supportive environment provided by a sick individual. Bechamp’s theory formed a contrast to an interpretation of the germ theory that identified external pathogens as the sole and direct cause of infectious disease, regardless of the prior health of the diseased person.

Meanwhile, Robert Koch, a contemporary of Bechamp and Pasteur, had formulated four postulates meant to establish a causal relationship between a unique pathogen and a unique disease. The postulates specified the following: (1) the pathogen should be found in all organisms suffering from the disease, but not in healthy individuals; (2) it must be possible to isolate the pathogen and grow it in a pure culture; (3) it must cause the disease when introduced into a healthy organism; and (4) it must then be possible to re-isolate it from the inoculated organism and find it identical to the original. Koch later had to qualify the first postulate after finding healthy, asymptomatic organisms carrying the bacteria that cause cholera and typhoid fever. He also had to qualify the third postulate after finding that not all organisms exposed to a pathogen will display symptoms of infection.

Koch’s findings indicated that both Bechamp’s and Pasteur’s theories had some merit. Pasteur’s disease-centered approach, which relied on sterilization, pasteurization, quarantine, and sanitation, was focused on preventing the spread of disease by eliminating the pathogen from the external environment. Bechamp’s health-centered approach was based on making the individual stronger and healthier, and thereby better able to prevent pathogens from gaining a foothold within the environment of the human body. Although the specific mechanism of Bechamp’s theory – that of microzymas arising from our own tissues to form pathogens – has never been proven, scientists have since discovered that our health plays a tremendous role in the effective functioning of our immune systems, and consequently affects how easily we succumb to infections.

The human immune system is a conglomerate of many different body parts and processes. The skin, liver, kidneys, respiratory tract, intestinal flora and more all play a role in destroying pathogens by means of inflammation, white blood cells, and antibacterial or antiviral chemicals and enzymes.  Those pathogens are discharged via the cleansing and flushing action of tears, urine, mucus and diarrhea. The cells that form the adaptive part of the immune system are able to retain memories of specific pathogens and thereby easily neutralize those pathogens with antibodies upon future encounters.

No system, mechanical or biological, can work properly unless supplied with the proper fuel or raw materials. A car cannot run without fuel, nor an ecosystem without water, oxygen, and sun. Our immune system is no different; in order to function, it must be provided with the nutrients it needs. Vitamin D, the antioxidant vitamins A, C, and E, vitamin B6, folic acid, and the minerals zinc, copper, iron, and selenium have all been found to be vital for promoting the health of the immune system, as have the beneficial bacteria contained in raw fermented foods. Other nutrients contained in whole foods that are as yet unstudied or even undiscovered may be similarly essential. Where infants are concerned, breast milk provides, in addition to needed nutrients, a variety of immunologic factors such as immunoglobulins (antibodies), the enzymes lysozyme and lactoferrin, and lymphocytes (white blood cells). These ingredients promote not only the health but also the growth and strengthening of the infant immune system, and they protect against the routine infections that are much more commonly seen today in babies fed formula, which contains no immunological factors. Along with nutrition, people need a certain amount of rest and sleep, as well as moderate exercise and clean water, in order to maintain healthy immune function. Stress, extreme conditions, exhaustion and dehydration all weaken the immune system, and they also weaken a nursing mother’s ability to provide nourishing milk.

The scientists who were formulating the germ theory of disease in the late 19th century were living during the tail end of the Industrial Revolution, a period of enormous social, political and technological change. Economies in Europe and America had shifted from an emphasis on rural agriculture to one on urban industry. In England in 1750, only 15% of people were living in urban areas, but by 1860, that number had risen to 80%. Within the cities, the lower classes (both children and adults) had started working long hours in factories for little pay, often doing heavy labor in extreme conditions. As a consequence they were frequently exhausted and, as a result of their poverty, malnourished. The upper classes, on the other hand, deliberately chose to eat newly available refined foods that were low in nutrients and high in calories, and many women did not get enough exercise or sunlight. They too were weak and sickly and prone to death in childbirth. In sum, the majority of people living during this era were in poor health, with poorly functioning immune systems and thus reduced resistance to disease.

At the same time, the cities to which so many had relocated lacked waste disposal systems adequate for such large populations. Consequently, pathogens were able to contaminate the air, water, food and streets. Technology had developed in such a way as to ease the transmission of infectious disease without yet possessing a means to prevent it. Doctors themselves were some of the worst transmitters. As yet unaware of the need to wash their hands (and in many cases outright rejecting the idea), they easily spread fatal pathogens to the many patients, particularly mothers in childbirth, whom they treated in busy urban hospitals. It is no surprise that infectious diseases ran rampant and that infant mortality averaged around 40%, with the highest rates occurring in the cities.

With the formulation of the germ theory of disease, the sanitary practices that Pasteur and the physician Ignaz Semmelweis proposed were grudgingly accepted by physicians, with positive results. However, the theory ultimately focused much more on the danger of microbes than on the ability of the healthy human body to resist them. As a result, scientists and government officials complemented sanitary practices with vaccines, rather than following Bechamp’s lead and promoting the healthy diet and lifestyle that would strengthen the immune system itself.

Fortunately, due to the explosion of nutrient-deficiency diseases during the same period, vitamins were gradually discovered and added back into the processed foods from which they had recently been removed. This resulted in better nutrition, which, together with advances in sanitation technology, greatly improved overall health and hygiene in Europe and America following the turn of the century, though the conditions for epidemics were still occasionally created by destabilizing events such as World War I and the Great Depression. Smallpox, cholera, tuberculosis, diphtheria, scarlet fever, typhoid fever and other infectious diseases all began to decrease, whether vaccines had been developed for them or not. Endemic diseases like measles, mumps, rubella and chickenpox persisted, but were far less likely than before to cause complications or fatalities.

The most notable disease still causing epidemics in the developed world into the mid-20th century was polio. Polio, like many of the epidemic diseases of the time, had been around for thousands of years without ever being responsible for major epidemics prior to the late 19th century. Since 90% of polio infections cause no symptoms at all, deaths and paralysis from polio were rare. All that changed with the onset of the Industrial Revolution and the weakened health of the population; suddenly, polio could spread easily, and it met with little resistance in its victims. As sanitation began to improve, fewer people were exposed to the polio virus as young children, when they are least likely to suffer harm from it, and when they can acquire long-term immunity. But while the number of people exposed to polio lessened, the number who died or suffered paralysis increased, since those who did not develop immunity as children encountered the virus as teenagers or adults, when the disease is more severe.

Additional factors in the severity of the polio epidemic were the rising fads of formula feeding and the tonsillectomy procedure. By 1950 over half of all babies were being fed infant formula (lacking polio antibodies, naturally), which was being promoted as better than breast milk by now-discredited scientific studies. It was around this time as well that tonsillectomies became a fad among surgeons and doctors; in the 1930s and 1940s between 1.5 and 2 million tonsillectomies were being performed each year. The tonsils are glands that aid the immune system by blocking pathogens; when they become inflamed, it is a sign that they are hard at work. These tissues form the first line of defense against ingested or inhaled pathogens, such as polio. Since polio was not as stymied by better sanitation as other diseases, it was able to take advantage of the weakened immune systems of older children and adults. Still, polio, like most other infectious diseases, continued its downward trend in incidence prior to the introduction of its vaccine.

As the historical evidence indicates, vaccines merely accelerated an already-occurring disappearance of infectious diseases in developed countries. Without advances in nutrition to ensure basic immune system function, and in sanitation to prevent the spread of pathogens, infectious disease would probably have persisted despite vaccination. Tuberculosis is a good example. During parts of the 19th century it was responsible for one quarter of all deaths in Europe. It no longer troubles the developed world, despite the fact that we never effectively vaccinated against it in America or Europe. However, it still causes between 1.5 and 2 million deaths per year in impoverished countries whose citizens have poor health and sanitation, despite widespread vaccination in those countries.

In conclusion, while vaccines do possess varying degrees of effectiveness, and can help to reduce the incidence of disease, they are not our most important form of protection. As Bechamp theorized, the explosion of infectious disease in the 19th century was really due to a relatively brief, but steep, reduction in general health, which, when paired with unsanitary living conditions, made epidemics inevitable. Our strategy for acquiring better immunity to all diseases, or providing the conditions for such immunity to our children, should primarily be to maintain good nutrition and health through breastfeeding, consumption of natural whole foods, clean water, regular rest, regular exercise, and reduction of stress.


In next week’s article, entitled “Two Approaches to Vaccination,” I’ll discuss the underlying worldview behind the modern-day vaccine schedule and contrast it with a more holistic approach to public health.

Vaccination, an Overview, Part 3: The New Epidemic

In America today, the infectious diseases that remain, such as the flu, are far less life-threatening, and infant mortality has decreased drastically over the past century. Children today are highly likely to make it to adulthood. Coinciding with the reduction of infectious disease, however, has been the emergence of an entirely new kind of health problem in children: chronic disease. Children in ever greater numbers are suffering from immune system disorders and developmental delays which have no known cause or cure. Eczema, hives, hay fever and food sensitivities have been increasing since the 1920s, with rapid surges in the 1960s and the 1980s, and these allergies now affect tens of millions. Asthma has been increasing since the 1960s, particularly in developed countries, and it now affects 6 million children in the U.S. Attention-Deficit Hyperactivity Disorder has tripled in incidence since the 1970s. Autism spectrum disorders have grown from 1 in 2,000 in the 1960s and 1970s to 1 in 150 today, with the greatest spike occurring from 1996 to 2007. All of these increases in incidence are too great to be explained solely by genetic mutations (although genetic susceptibility does seem to be a factor) or by evolving diagnostic methods and definitions. Consequently, an external, environmental agent (or agents) must be triggering them. Since these diseases are chronic, seem to be unassociated with any pathogen, and are not infectious, they cannot be explained by the germ theory of disease, and scientists possess no alternative theory of what in our environment could be triggering these types of health problems.

It is worth noting that our environment has changed drastically over the last half-century. Our food, water and air are less likely to be contaminated by bacteria like those that cause tuberculosis or cholera, but are more likely to contain pesticides and other potentially toxic chemicals. Children who used to run and play outdoors, using up their excess energy and exposing their immune systems to many different natural substances, from pollen to poison ivy, now spend most of their time indoors, in school or sitting still in front of a screen at home. At the same time they have adopted diets high in excess calories and low in nutrients. Antibiotics and pasteurization have reduced the presence of both bad and good bacteria in their lives. This new lifestyle could be the culprit behind children’s hypersensitive immune systems and hyperactive behavior, or at least a contributor. When it comes to autism spectrum disorders, however, many parents believe that vaccines play a major role.

Vaccines have never been completely without side effects, and even the safest vaccines will cause temporary side effects (such as pain and swelling, fever, vomiting, diarrhea, rashes, headaches and crying) between 5% and 40% of the time. Serious side effects are usually some form of inflammation: Guillain-Barre syndrome (an autoimmune disorder causing paralysis) and encephalitis (inflammation of the brain). However, these are said to be extremely rare. A vaccine whose serious side effects were found to be relatively more common was the first combination vaccine, DTP (diphtheria, tetanus and pertussis), which was released on the market in 1946. In the 1970s and 1980s there was a growing awareness that the pertussis portion of the vaccine, which used whole cells of the B. pertussis bacterium, was responsible for a higher-than-expected rate of reactions such as convulsions, shock, cardiac distress and brain damage. In 1981 Japanese scientists developed a new vaccine that used a safer acellular pertussis component and caused far fewer reactions, but this form of the vaccine was only adopted in the United States in 1996, after many years of lobbying by parents who had observed their children react adversely to the DTP vaccine.

As was the case with the DTP vaccine, suspicions of a link between autism and vaccines have their initial basis in the case reports of parents who see their children lose previously acquired mental and social skills following doses of vaccines, the majority of which are administered in the first two years of life, the same timespan in which autism usually appears. This correlation could be explained as a coincidence, but the issue is complicated by the fact that rates of autism have increased in conjunction with the rising number of shots given to children. In 1983, for example, children received vaccines for diphtheria, tetanus and pertussis (given together as DTP), polio, and measles, mumps and rubella (given together as MMR). This schedule represented vaccines for 7 diseases in the first 4 years: 6 shots containing 18 doses of vaccines, plus an additional 4 doses of the oral polio vaccine, for a total of 22 doses. In 1995 the schedule was largely the same, except for the addition of the vaccine against Haemophilus influenzae type B (Hib), a bacterium that causes meningitis. After that, however, the number of vaccines began to increase. By 2007, children following the standard schedule were receiving 40 total doses of vaccines against 14 diseases, double what had been given a decade previously. The number of shots did not increase as much, because new combination vaccines became available that combine four or five vaccines into one shot. The result has been a significant increase in the amount of foreign material injected into a child’s body at one time.

As discussed in last week’s newsletter, the ingredients of a vaccine must be carefully balanced and formulated in order for the vaccine to be both safe and effective. The typical vaccine components mentioned in the first section – the pathogen, the tissues in which it is cultured, an adjuvant to help stimulate immunity, and a preservative to protect the vaccine from additional pathogens – are each capable of causing unwanted side effects. Live viruses and bacteria, found in the MMR vaccine among others, are better able to stimulate immunity, but are more likely than weakened or killed pathogens to cause a persistent infection and excessive inflammation, including inflammation of the brain (encephalitis) and subsequent brain damage. Animal or human tissues in which pathogens are cultured contain proteins similar to those contained in our own tissues. In reacting to the pathogen in a vaccine, some immune systems may see these proteins as part of the threat, and produce autoantibodies against them. These autoantibodies cannot tell the difference between the injected proteins and the body’s own proteins, resulting in chronic inflammatory autoimmune diseases such as Guillain-Barre syndrome, arthritis or multiple sclerosis. The most typical adjuvant in vaccines, aluminum, is a metal that has been linked to Alzheimer’s disease, dementia and brain damage, and it may be difficult for some children to detoxify. As for preservatives, some vaccines contain formaldehyde, a carcinogen, and most vaccines previously contained thiomersal, a form of mercury, before vaccine manufacturers agreed several years ago to provide mercury-free vaccines upon request. Could these ingredients, as they are injected into children with greater frequency and in greater quantities, be responsible for the increasing incidence of chronic immune hypersensitivity and developmental disorders in children? Clearly, not all children have negative long-term reactions to vaccines; in fact, it seems that most of them don’t. But might some children have a genetic susceptibility to adverse reactions, particularly when vaccines are administered according to the current schedule?

What are the facts of the situation? First, vaccines carry the potential for adverse effects, including brain damage. Second, there is a parallel between increasing autism rates and the increased number of vaccines given. Last, autism typically emerges in children during the period of time when vaccines are administered. What have we proved? Nothing. These facts are not proof of a causal relationship between vaccines and autism; they only show a correlation. However, this correlation makes a causal relationship a possibility worth investigating, especially since no other cause of autism has been identified. Accordingly, many scientific studies have been done on whether a link between vaccines and autism exists. The initial safety studies done on each new vaccine by Merck, Sanofi Pasteur, Wyeth, and GlaxoSmithKline (the four large pharmaceutical companies that manufacture almost all vaccines), the results of which are reviewed by the FDA, together with the reports collected by the CDC’s Vaccine Adverse Events Reporting System (VAERS), have not established a link for any individual vaccine. Doctors and research scientists, most notably at the independent, non-profit Institute of Medicine, have conducted many additional studies over the past two decades, as well as comprehensive reviews of earlier research, and the vast majority of them have also concluded that no link can be proven, thus confirming the scientific consensus that the serious side effects of vaccines are extremely rare and do not include autism.

The most famous study that did hint at a possible connection between vaccines and autism was published in 1998 in The Lancet, a British medical journal that is perhaps the most respected in the world. The lead author, Dr. Andrew Wakefield, and twelve of his colleagues, argued, based on observations of twelve children with both inflammatory bowel disease and autism, that the children might have a new syndrome caused by the vaccine-strain measles virus, which was found in their intestines. Because the children were previously normal, Dr. Wakefield suggested an environmental trigger might be the cause of the syndrome, and called for the MMR vaccine (measles-mumps-rubella combination) to be discontinued in favor of separate vaccines administered at separate times, until more research could be done. However, the British government felt that to do so would increase the exposure of children to the three diseases. The results of the study were widely reported in the news media, and with MMR remaining the only vaccine available, many parents did not vaccinate their children against the diseases at all.

In the years that followed, both Wakefield and the study received increasing criticism. Other scientists did similar studies and reported that they had failed to duplicate the results. A journalist investigating Wakefield found that he had ties to a lawyer preparing a lawsuit against the MMR manufacturers, and that he had a patent on a new measles vaccine, both indicative of serious conflicts of interest. Ten of the twelve co-authors eventually disowned the paper. Earlier this year, The Lancet itself finally retracted the paper, and Dr. Wakefield lost his license to practice medicine in the UK.

In light of this evidence, it would seem that the possibility of any link between vaccines and autism has been thoroughly eliminated. But for a variety of reasons, we must question the credibility of those who signed off on vaccine safety, who authored and reviewed pro-vaccine studies, and who have promoted vaccines in the media. To begin with, the general public has long had good reason to distrust the ethics and integrity of the pharmaceutical industry, which has been known to disguise or minimize knowledge of adverse reactions to its products (such as Avandia, Vioxx and Fen-Phen). It has also been known to aggressively market its products to as wide a customer base as possible, even urging, in recent months and with governmental approval, cholesterol-lowering drugs on people who do not have high cholesterol. Vaccines are a guaranteed lucrative investment, given that they are prescribed to almost every individual in the country.

An additional strike against the pharmaceutical companies’ assurances of safety is that they bear no legal liability for adverse side effects of the vaccines they manufacture. In the 1980s, as more parents whose children had been injured by the DTP vaccine began to bring lawsuits against vaccine manufacturers, those manufacturers threatened to stop making vaccines entirely, reasoning that it would be unprofitable to continue if they had to pay expensive personal injury claims. In order to ensure that vaccines remained available to the public, the U.S. government stepped in and passed the National Childhood Vaccine Injury Act, which set up a special government court for hearing vaccine injury claims and awarding damages of up to $250,000. The damages are funded by proceeds from a tax on vaccines, thus shielding vaccine manufacturers from any financial liability. Claims are argued before a government-appointed judge rather than a jury, and while most claims are rejected, the court has awarded almost $2 billion in damages since its inception.

Clearly, pharmaceutical companies manufacture vaccines for profit, not out of an overriding concern for the safety of children. It is not likely that they would abandon profitable products such as vaccines even if they knew that such products caused relatively frequent and severe side effects, just as they knew, but kept secret, the fact that Avandia increased the risk of heart attacks. It is therefore prudent not to accept at face value claims (and by claims, I mean advertising) by the vaccine manufacturers, and by the scientists whom they employ, that vaccines are extremely safe.

What about the government’s independent oversight and regulatory authority? Unfortunately, as in so many industries (including banking, energy, and health care), a revolving door of employment exists between the pharmaceutical companies and the federal authorities that regulate them. An example is Dr. Julie Gerberding, who directed the CDC from 2002 to 2009, a period during which both the number of vaccines administered and the number of autism cases greatly increased. Dr. Gerberding waited exactly one year and one day after leaving the CDC – the legal minimum – before taking the job of President of the Vaccine Division of Merck. During her CDC tenure, Gerberding heavily promoted Merck’s new-to-the-market HPV vaccine, Gardasil, as well as the safety and effectiveness of vaccines in general.

As for the scientists and medical doctors who conduct research on the safety of vaccines, many rely on the financial support of the pharmaceutical companies to carry out their research. Without that support, they would be unable to carry out wide-ranging, long-lasting epidemiological studies of vaccine reactions. The most vocal and media-friendly proponent of vaccine safety, Dr. Paul Offit of the Children’s Hospital of Philadelphia, happens to be the co-inventor of the rotavirus vaccine RotaTeq (also manufactured by Merck); his hospital sold its royalty interest in RotaTeq for $182 million, with Offit receiving a share.

The conflicts of interest described so far have their origin in greed, but some can arise from humanitarian motivations. Most public health officials worry that if doubts about vaccine safety are given a more thorough hearing, many parents might choose to vaccinate their children less, or not at all (as happened in the aftermath of the Wakefield study’s publication), and consequently return us to an era of epidemic disease rivaling that of the 19th and early 20th centuries. To serve the greater good, the authorities may be unwilling to give a fair hearing to the possibility of a vaccine-autism link. It is possible that, even if Dr. Wakefield was partly right in his conclusions, the government and scientific community were driven by these types of fears to dissect his work for errors and conflicts and to magnify those flaws.

With so many powerful institutions – pharmaceutical companies, government agencies, and scientific bodies – motivated for a variety of reasons to disprove a link between vaccines and autism, it is unlikely that any individual scientist or pediatrician would be willing to stake their reputation, and potentially even their license to practice medicine, on publishing (or even conducting) a study indicating greater-than-reported side effects of vaccines. Not only would funding for such a study be difficult to obtain, but any flaws in its methodology would be far more heavily scrutinized than if the study confirmed what has already been promoted as scientific truth.

If so many conflicts of interest are at work, shouldn’t we expect to see weaknesses in the pro-vaccine studies? In fact, on closer examination, many of the studies showing that vaccines are unrelated to autism have significant methodological flaws or are reported as having broader conclusions than they really do. To take a recent example, an epidemiological study by researchers from the University of Louisville School of Medicine was published in the journal Pediatrics on May 24th of this year, concluding that giving children vaccines on schedule had no negative effect on long-term neurodevelopment. Most news outlets reported that the study had shattered the “myth” that a delayed or alternative vaccine schedule was safer than the standard, CDC-recommended schedule. However, the study was based on data from a 2007 study published in the New England Journal of Medicine intended to determine whether increased amounts of thiomersal in vaccines caused greater numbers of neuropsychological disorders. That study contained a disclaimer noting that children with autism spectrum disorders were specifically excluded from the data set. Consequently, such children were not examined in the recent study either, and the authors acknowledged that they were restricted in their ability to assess outcomes such as neurodevelopmental delay, autism, and autoimmune disorders. The differences between the two groups that were compared were also not significant: those placed in the “timely” group received the recommended 10 vaccines in their first seven months, while the “untimely” group received an average of 8. The untimely group, though their shots were delayed, did not actually receive fewer vaccines at each doctor visit, and the study indicates that they may have missed vaccines for socioeconomic reasons rather than through intentionally following a different schedule. Finally, the study covered only children receiving shots from 1993 to 1997, the period just prior to the one in which the number of vaccine doses increased dramatically.

As stated above, these types of omissions and flaws are characteristic of most of the pro-vaccine studies. But the fact that Dr. Wakefield’s study has been discredited as well is not necessarily comforting for those wanting reassurance about the safety of vaccines, since it indicates that his conflicts of interest, as well as an error-filled study, somehow escaped the notice both of the editors of the Lancet and of the dozen co-authors who participated in the research. We must conclude that we cannot simply take for granted the results of scientific studies from even the best medical journals, having seen what happens when they are subjected to intense scrutiny. And, above all, we must keep in mind that such scrutiny is not likely to be applied to studies that confirm the scientific consensus on vaccines.

To better determine whether a connection might exist between vaccines and autism, we would need a long-term study comparing the health problems of a control group of completely unvaccinated infants against another group that follows the standard vaccine schedule, and possibly additional groups that follow selective or alternative vaccine schedules. No study of this type has yet been done. Pro-vaccine groups argue that such a study would be unethical, assuming ahead of time that vaccines are safer than the alternative, though this is exactly what the study would be meant to determine. Though such a study would be expensive, anti-vaccine groups might be able to fund it, were it not for the fact that, having staked their reputations on a link between vaccines and autism, they could not be considered an objective sponsor. Perhaps the main obstacle, however, is that a study of this type would require a large number of children to go unvaccinated, leaving them potentially susceptible to disease, and no public or private institution would want to take responsibility and liability for these potential adverse effects. Of course, autism is itself an epidemic that must be addressed, but as long as its cause remains unknown, no institution is officially liable for it. Only the families of autistic children bear the burden.

As the controversy rages on, fewer parents are taking the medical establishment (including the CDC) at its word. On May 5th, 2010, the CDC announced the results of a study it had conducted on parental compliance with the current recommended vaccine schedule. The percentage of parents who refused or delayed at least one vaccine for their children had increased from 22% in 2003 to 39% in 2008. Why? The parents cited concerns about the safety of vaccines, particularly the risk of autism. If the risks of vaccines are in fact much greater than reported, these parents seem to be making the right choice. However, one must not forget the reason why we vaccinate in the first place: to protect our children from infectious diseases. Eliminating one of the possible causes of autism from your child’s life won’t do them any good if they suffer permanent damage or death from polio, measles, diphtheria, tetanus or meningitis. Suspecting that the side effects of vaccines may be greater than reported therefore leaves us with no easy decision. The overarching question that remains is the same one that has pursued us throughout human history: how do we safely protect our children from disease?


We’ll take a stab at answering that question in next week’s newsletter, “Building Immunity.”


Vaccination: An Overview (Parts 1 and 2)

1. How Vaccines Work


We live in a world permeated by microorganisms of all kinds – bacteria, fungi, even microscopic animals and plants. Microorganisms interact with human beings in a number of different ways, in many cases seeking us out as their hosts for mutual benefit. Probiotics, for example, are various species of bacteria that live in our intestines, helping us digest our food and absorb nutrients. But some viral and bacterial microorganisms, known as pathogens or germs, cause disease and death in their human hosts rather than coexisting in a mutually beneficial relationship. Vaccination is meant to be a way of protecting us from these pathogens.

Generally speaking, a vaccine is a biological solution, prepared in a laboratory, that contains a weakened or killed virus or bacteria. A person who receives a dose of a vaccine containing a microorganism becomes immune to the disease caused by that microorganism. For example, the measles vaccine grants immunity to the measles virus and thereby to the disease the virus causes. The vaccine accomplishes this by taking advantage of the amazing immune system that exists in the human body.

The immune system is a network of biological processes that combine to protect us from infectious agents such as the pathogens mentioned above. Components of the immune system include physical barriers like skin and mucus but also interior protective agents such as white blood cells and interferons (proteins that protect us from viruses). Our most complex and advanced form of immunity, known as adaptive immunity, involves antibodies (aka immunoglobulins). Antibodies are specific proteins that the immune system produces upon encountering a foreign substance such as a microbe (aka an antigen). An antibody enables the body to more quickly recognize and neutralize the antigen to which it corresponds. As a result, after just one encounter with a pathogen, we can become permanently immune to it upon any future encounters. In other words, due to our ability to produce antibodies, we are able to adapt to an attack such that the same attack won’t work on us twice.

When we are injected with a dose of a vaccine containing a weakened or killed virus or bacteria, the immune system kicks into gear and fights off the pathogen, at the same time producing antibodies against it. Ideally, the pathogen will be weak enough to pose no danger to the body, but strong enough to still stimulate antibody formation. That way, if we encounter the pathogen in the future, we’ll have the antibodies ready to fight it off regardless of its strength. In other words, we’ll be immune to it.

Most vaccines contain, in addition to the pathogen, the following ingredients: animal or human tissues, which serve as a medium in which the pathogen can be cultured; a preservative (such as thiomersal, a mercury-containing compound, or formaldehyde) to keep other pathogens from contaminating the vaccine; a stabilizer, such as MSG, to prevent the vaccine from being damaged by heat, light, acidity or humidity; and an “adjuvant,” usually aluminum, a substance that increases the response of the immune system. These ingredients, which differ from vaccine to vaccine, are the result of many decades of research on how to make vaccines safe, effective, and cost-effective.

The most crucial balance to strike in making a vaccine is between a too-strong pathogen and a too-weak one. In the former case, the pathogen may overwhelm the recipient’s immune system, resulting in disease; in the latter, it may not stimulate lasting immunity. For example, the oral polio vaccine, which used a live polio virus administered by the same route through which the actual polio virus is contracted, caused polio and subsequent paralysis in a small number of children each year before it was discontinued in the United States in the early 2000s. For this reason many vaccines are instead injected and use weakened or killed pathogens, relying partially on the aforementioned adjuvants for additional stimulation of the immune system. However, this method, presumably because it bypasses certain aspects of the immune system, sometimes does not result in lasting antibody production, in which case it does not confer permanent, lifelong immunity (hence the need for recurrent “booster shots” of certain vaccines). In contrast, immunity from a naturally contracted infection is more likely to be permanent, but the risk of serious disease is much greater when acquiring immunity in this way. This dilemma of safety versus effectiveness, of stimulating immunity without harming the patient, has been present since the earliest and most rudimentary attempts at vaccination.


2. The History of Vaccination


Observing the progress of the Plague of Athens in 430 BC, the Greek historian Thucydides wrote that the plague (now thought to be typhus) “never took any man the second time so as to be mortal.” Those who got sick but survived did not have to fear dying from the disease later on. Similar observations of adaptive immunity may have been what led seventh-century Buddhist monks to adopt the practice of drinking small amounts of snake venom to make themselves immune to the poison from an actual bite. In ancient China, the most threatening disease was smallpox, and by the 10th century one Buddhist nun had found a method for treating smallpox with inoculation. Inoculation, a more general term than vaccination, is the placement of something into a medium in which it can grow and reproduce, such as a plant part grafted onto another plant, or an antigen into a human body. Inoculation with smallpox for immunization purposes is known as variolation. Over the next few centuries, variolation became common practice in China as a means of providing some protection against smallpox.

Ancient Chinese methods of variolation generally consisted of drying and pulverizing smallpox scabs from people with mild cases of smallpox and blowing the scab powder into the nostrils of healthy people. The mild cases were chosen for the same reason that vaccine makers now often use weakened or killed pathogens: to reduce the risk of inducing a serious infection. Another form of variolation was to have healthy children wear the undergarments of infected children for several days – a tactic similar to the chickenpox playdates of the 20th century, prior to the invention of the chickenpox vaccine.

Similar forms of variolation were eventually practiced in India, Byzantium and the Middle East. Through the Crusades, the slave trade, and other forms of trade and contact, smallpox spread to Europe and the Americas, and variolation followed. Variolation techniques now included applying smallpox scab powder to cuts or scratches on the skin, and the process was slowly accepted in the West as a preventative against the disease, though many distrusted it because of its Eastern origin. The major drawback of variolation, however, was that people occasionally developed serious cases of smallpox from the procedure, and either died or suffered scarring and blindness. People sometimes feared the preventative almost as much as the disease itself.

In eighteenth-century England, smallpox was widespread, but one group of people was curiously unaffected by the disease: dairy workers. Through their contact with cows, dairy workers typically became infected with cowpox, a disease similar to smallpox but much less dangerous, which was spread by touch from the infected udders of cows to humans. Cowpox was similar enough to smallpox that the antibodies produced by the infected workers could fight off smallpox microbes as well as cowpox microbes. One of the first people to take advantage of this phenomenon to deliberately induce immunity was an English dairy farmer, Benjamin Jesty. In 1774, during a local smallpox epidemic, Jesty infected his family with the cowpox virus that had already infected his servants and workers. The family easily recovered from cowpox and was untouched by smallpox.

Other farmers carried out similar experiments with success. Eventually, word of this immunization method reached the surgeon and scientist Edward Jenner, who in 1796 decided to test it by inoculating his gardener’s eight-year-old son with pus from a milkmaid’s cowpox blisters, and then deliberately injecting him with smallpox (scientists had a little more leeway to experiment freely back then). Since the smallpox virus did not appear to affect the boy, Jenner announced that he had been successfully “vaccinated,” deriving the term from vacca, Latin for “cow.” Jenner continued to test vaccination on dozens of additional subjects with immediate success, and thanks to his connections in scientific and government circles, was able to widely publicize his findings. He also founded an institution to promote his method, and the British government soon banned variolation in favor of vaccination.

Over the course of the 19th century, vaccination against smallpox became standard practice in most European countries, and was in some cases mandatory. However, smallpox epidemics continued, particularly during times of stress and upheaval. During the Franco-Prussian War of 1870-71, a smallpox epidemic struck France and Germany and killed over 100,000 people. Jenner himself had become aware that both the safety and the effectiveness of the smallpox vaccine were less than ideal. He had discovered that a significant number of people still developed smallpox even after vaccination. They also sometimes became infected with other diseases that had contaminated the vaccine. As for the immunity conferred by vaccination, it generally lasted only 3-5 years before beginning to decline.

What Jenner did not know was the nature of smallpox and how it was transmitted. Only by the end of the 19th century did scientists investigating both smallpox and the many other infectious diseases that were prevalent at the time (tuberculosis, diphtheria, cholera and typhus, among others) come up with the famous germ theory of disease. The germ theory stated that each individual infectious disease was caused by an individual, microscopic, living organism. The noted French chemist Louis Pasteur was a major contributor to the theory, having proven that microscopic organisms, good and bad, do not generate spontaneously but reproduce by subsisting on nutrients, and can be airborne or anaerobic. Pasteur subsequently put his discoveries to use in developing pasteurization, the method of heating liquids to kill most microorganisms present within them.


The germ theory of disease enabled scientists to more easily develop vaccines against infectious diseases besides smallpox. Pasteur himself worked on vaccines against rabies and anthrax. Aided by his expertise in microbiology, he discovered methods for attenuating (weakening) bacteria in vaccines so that the vaccines could confer immunity with less risk of actually causing disease.  In the following decades, scientists further refined and improved the techniques of vaccine development, introducing vaccines for diphtheria, tetanus, and whooping cough prior to World War II. A polio vaccine was developed during the early 1950s. Since then, vaccines have been developed for many other infectious diseases: measles, mumps, rubella, hepatitis A and B, meningitis, chickenpox, flu and most recently HPV and rotavirus. Today, each disease against which we routinely vaccinate has a small or nonexistent incidence in the developed world. If the 19th century was the Age of Infectious Disease, the 20th century was the Age of the Vaccine.


Health Food Store Shopping List

In the old days, health food stores were small, grungy, lovable, hole-in-the-wall establishments that carried a few basics for health food nuts: organic carrots, tofu, brown rice, sea vegetables, carob chips, etc. As healthy eating became more popular, these stores multiplied to the point where almost every major town in America had a local health food store. With that multiplication came expansion: in addition to rice, beans and greens, you could also acquire healthier versions of the chips, crackers and cookies carried by conventional supermarkets.

In recent years, the health food store market has been cornered by Whole Foods, a mega-chain that drove many smaller stores out of business. While Whole Foods has made health food more accessible for many people, it may have missed the point of the original health food store. At many Whole Foods stores, it has become almost impossible to find bulk brown rice, or macrobiotic foods, among the countless aisles of organic soda and breakfast cereal made with cane sugar instead of corn syrup. While those processed foods are better than their counterparts at the local Walmart, they demonstrate that just because a food is sold in a health food store doesn’t mean it’s healthy. What follows is my slimmed-down guide to the essential foods you need from your local health food store:

Fruit. The best-tasting and most nutritious fruit is fresh, local and organic, qualities you can usually find only in a health food store or at a farmer’s market. Make local and in-season your priority, followed by organic. Fruit can be expensive, but it’s worth the cost. If you need to, shop at a conventional supermarket for fresh fruit rather than go without entirely.

Vegetables. Health food stores carry a wide variety of fresh, local and organic vegetables. When purchasing vegetables, try to incorporate a variety of different vegetable groups, which include greens (such as kale or collards), roots (like carrots and beets), bulbs (like onions and garlic), stalks (such as celery), gourds (squashes), and nightshades (tomatoes, potatoes, eggplant, etc.). Vegetables should generally be stored in the crisper at the bottom of your refrigerator. Greens need to be kept in a plastic bag with the air pressed out, and they should first be wrapped in a paper towel or two to soak up any water on them. Greens keep until they turn yellow (about a week or two). Other vegetables will stay good for several weeks, but generally it’s best to use them quickly, as they lose nutrients over time.

Grains and Beans. Whole grains and beans are kept in bulk bins, and are very inexpensive when purchased this way. You can get a large share of your calories and protein from these foods. They take some time to cook (especially beans, which need to be soaked and then boiled), but they last a long time, and once cooked they keep in the fridge for several days. Uncooked dry beans and grains keep for a year or more.

Nuts and Dried Fruit. These are good choices for snack foods, but they are not meal replacements. Only snack between meals if you’re still hungry despite having had a solid breakfast, lunch and dinner. These foods can also be expensive. Don’t cut back on fresh fruit, vegetables, or all-natural animal products so that you can buy snack foods.

Dairy Products. Raw dairy products from healthy cows are best, but in most states these cannot be sold in stores. Cheese is an exception—choose unpasteurized cheese when it is available. If you don’t have a source of raw milk, grass-fed organic pasteurized milk is the next best thing (non-homogenized is good too). Butter is best when cultured and unsalted. Yogurt should have no added sugar; add your own natural sweetener (such as raw honey) instead.

Fish. Fish is very good for you if it is wild-caught rather than farm-raised. Wild-caught fish is more expensive, so you may have to eat it only occasionally, which is okay, especially since mercury in many species of fish is a concern. Sardines are a low-mercury, less expensive option.

Poultry, Pork, Beef, Eggs. Meat and eggs can be an important part of your diet and a good source of protein and fat, but the meat must come from a healthy animal. For poultry, choose organic and free-range (or at least free-range), as well as hormone- and antibiotic-free. Pork should be organic if possible. Beef should be organic, but more importantly, grass-fed. Eggs should come from organic, free-range chickens. Be sure to check out local options.

Herbs and Spices, Salt, Natural Sweeteners. All of these condiments should be staples in your kitchen. Start building up a collection of herbs and spices and natural sweeteners (especially raw honey), and use sea salt instead of regular salt. Buying herbs and spices in bulk is more cost-effective, and you can buy smaller amounts of the ones you use less often. Most health food stores have a bulk spice section separate from bulk grains and beans.

Macrobiotic Foods. The original standbys of the health food store, these foods can now be difficult to find. However, they are usually grouped together, and include tamari (a natural form of soy sauce), brown rice vinegar, umeboshi plum paste, tekka, gomashio, and sea vegetables (nori, kombu, wakame, arame, hijiki). In the cold section you can usually also find miso, tofu, tempeh, seitan, and mochi. These foods originate in the traditional Japanese diet and are all very nutritious and beneficial to health.

Oils, Vinegars, Sauces, Nut Butters, Pastas, Pickles, etc. Not all the foods you buy need to be whole foods. It’s not practical to press your own olive oil or grind your own peanuts for nut butter, for example, and you wouldn’t necessarily end up with a better product. In this sense some pre-made foods are perfectly fine, as long as the ingredients themselves are whole foods. Peanut butter that contains peanuts and salt—fine. Peanut butter containing peanuts, salt, and hydrogenated vegetable oil—not so good. Pasta made from whole wheat is much better than pasta made from white flour. Olive oil should be unrefined and unfiltered. Generally, the fewer the ingredients—and the more whole-food the ingredients—the better.

Bread. People are always confused about what bread to buy, but the answer is fairly simple. Choose bread made from 100% whole grain flour (i.e. whole wheat, whole rye, etc.). If the label says simply “wheat flour” or “whole wheat flour and white flour,” skip it. Bread made from 100% whole grain flour molds quickly, since it is so nutritious, so keep it in the refrigerator or freezer. In fact, many whole grain breads are kept in the freezer section of the health food store. Don’t be afraid to try something new!

Supplements. If you eat the foods listed above, you really don’t need supplements. You may occasionally benefit from certain herbs if you happen to be sick, but the most important thing is to eat a good diet so you don’t get sick in the first place. When it comes to injuries such as bruises, cuts, stings and bug bites, and burns, the supplement section has some effective remedies such as arnica, calendula, stingstop, and aloe vera.

If any of the foods listed above are new to you, or you have no idea how to incorporate them into your diet, be sure to contact me with your questions!