Vaccination, an Overview, Part 5: Two Approaches to Vaccination

The failure to recognize the primary cause of the epidemics of previous centuries – that is, malnutrition and exhaustion leading to weak health – can be attributed to the medical establishment’s general emphasis on the symptoms, rather than the causes, of illness. This emphasis is not unique to modern medicine, or even to mainstream medicine. Throughout history, both mainstream doctors and alternative health practitioners have sought to provide us with “quick fixes” and convenient solutions to the symptoms of problems initially caused by poor diet and lifestyle, rather than recommending a diet and lifestyle that would help prevent such problems in the first place.

The first flaw in this approach is that the medical interventions intended to eliminate our symptoms invariably have significant side effects of their own, which then require further interventions with additional side effects. A second flaw is that when such interventions are unavailable, we are left entirely at the mercy of disease if we have never been taught how to care for our own health.

We can think of suffering a severe reaction to an infectious disease as a “symptom” of being in poor health in the first place. However, the medical establishment, by minimizing the role played by a healthy immune system, implies that we are all equally defenseless against pathogens. The consequences of this approach are the ever-increasing vaccine schedule and our society’s “germophobia.” In fairness, acknowledgement of the influence of diet and lifestyle on healthy immune function, and of the abilities of the immune system itself, has increased in recent decades. But when flu season strikes, the message we hear is primarily to get vaccinated and to minimize the spread of germs. Just as is frequently the case with diet, we try to eliminate what’s bad, but we don’t try to replace it with what’s good.

In taking the vaccinate-and-sanitize approach to fighting disease, rather than an approach that promotes health, we may simply have shifted the battleground. Part 3 already discussed the real and potential debilitating side effects of the vaccines we administer to our children and ourselves. As for sanitization, while it can eliminate potentially harmful germs from the environment, it often does so by means of toxic chemicals. And a lack of exposure to germs is likely to result in oversensitive immune systems that react negatively to pollen and commonly eaten foods.

In accordance with the second flaw of the symptom-focused approach, children are left vulnerable in instances where vaccines do not succeed in providing immunity, where they are unavailable, or where they have not been invented. And when unhealthy children enter germ-laden environments, as is inevitable given the time they spend in the doctor’s office or hospital, they are at serious risk. Consider that in the United States an estimated 80,000 individuals per year die from infections acquired while in hospitals, most frequently due to catheters – not surprising, given that hospital workers wash their hands only about 70% of the time. Our general environment, including our hospitals, is far more hygienic now than it was 150 years ago, and we are generally better nourished as well, but we still have a long way to go.

As a short-term strategy, vaccinate-and-sanitize saves lives. But like most approaches that only seek to address the symptoms, not the causes, of our health problems, it inevitably results in new, mysterious health problems. At best, focusing on vaccines largely maintains the status quo. The Bill & Melinda Gates Foundation is an excellent example. The world’s largest private foundation, it possesses an endowment of $33 billion and devotes a significant portion of its resources to improving global health. Vaccination and medication programs in third-world countries are the primary beneficiaries of these resources. However, after decades of giving, vaccine-preventable and other diseases still persist in these countries, and the individuals receiving vaccines and medications often lack basic needs such as nutrition, clean water, and transportation. If they can even access the medications, they may not have enough food to digest them. The wealth of the foundation is ultimately directed to the already wealthy pharmaceutical companies, while the residents of third-world countries remain malnourished and impoverished.

In the long run, symptom-focused strategies tend to benefit those who promote them more than those who are subject to them.

A better way to handle the threat of infectious disease would be to create the conditions for healthy, strong immune systems in children. As discussed in Part 4, these conditions include eating a diet based on whole foods (and breast milk in the case of infants); drinking clean water as the primary beverage; getting enough rest and enough exercise; and reducing stress. Like a muscle, the immune system must also be exercised in order to be strong. Natural “vaccination,” or technically, immunization, can occur when we are exposed from a young age to a wide variety of microbes in raw and fermented foods, in breast milk, even in dirt – a process not too different from that by which Benjamin Jesty’s dairy workers were naturally protected from smallpox. In fact, a healthy child’s natural exposure to milder infectious diseases, such as chickenpox, may be beneficial for healthy immune development. A well-nourished child with a well-exercised immune system is strongly equipped to handle the pathogens he or she is likely to encounter, and is likely to be one of the vast majority of children who do not suffer severe reactions to more serious diseases such as measles, or even polio or diphtheria. A child who, in contrast, is raised largely on processed food, gets little exposure to beneficial bacteria, leads a sedentary lifestyle, and suffers frequent colds and ear infections that are treated with antibiotics rather than fought off by the immune system is more likely to have a severe reaction to a strong pathogen. In the absence of a change in diet and lifestyle, such a child would probably benefit from the protection of most vaccines, despite their potential side effects.

Unfortunately, the well-nourished child is far more rarely seen in our society than his or her conventionally nourished counterpart. The medical establishment has, for the most part, chosen neither to study nor to promote practices that make us healthier and consequently less dependent upon vaccines, medications, supplements, sanitizers, and surgery. Most doctors acknowledge that whole foods are better than processed foods, breast milk superior to infant formula, and some exercise better than none. Some may even recommend playing in the dirt over sitting inside all day. But junk foods continue to infiltrate our schools just as recess programs disappear from them. Infant formula is pushed on women whose babies are not growing at rates arbitrarily determined to be acceptable. Many people consider it inconvenient to try to be healthy or to allow the immune system to fight disease – the primary reason why a chickenpox vaccine, for example, was invented. As long as our attitude towards health is symptom-focused and values short-term convenience over long-term wellness, our health will remain vulnerable and our need for medical interventions will grow.

In the late 19th century, during the worst epidemics of disease, the author and social theorist Leo Tolstoy wrote, in an essay entitled “Modern Science”: “The defenders of present-day science seem to think that the curing of one child from diphtheria, among those Russian children of whom 50% (and even 80% in the Foundling Hospitals) die as a regular thing apart from diphtheria must convince anyone of the beneficence of science in general…our life is so arranged that from bad food, excessive and harmful work, bad dwellings and clothes, or from want, not children only, but a majority of people, die before they have lived half the years that should be theirs…And, in proof of the fruitfulness of science, we are told that it cures one in a thousand of the sick, who are sick only because science has neglected its proper business.” According to Tolstoy, science, though an incredibly valuable tool, does not benefit humanity when misdirected. To be beneficial to us, medical science must be guided by wisdom and foresight rather than shortsightedness, and should possess a healthy respect for, and inquisitiveness into, the capacities of the healthy human body.

The ultimate question of whether and how to vaccinate your child is a difficult one, and the answer is not the same for everyone. There is no utterly risk-free approach: even the healthiest person can still succumb to a powerful pathogen, as can the most thoroughly vaccinated person. Your decision must involve an awareness of your child’s likely susceptibility to each disease against which we vaccinate, and a weighing of each vaccine’s benefits against the risks of its possible side effects. Regardless of your decision, however, the best thing you can do for your children is to make the changes in diet and lifestyle necessary to promote health. For further reading on the risks and benefits of vaccines, as well as strategies for strengthening the immune system, I suggest you consult one or more of the books listed below.

 

Vaccinations: A Thoughtful Parent’s Guide by Aviva Jill Romm (2001). Discusses vaccines from a historical perspective and contains natural and herbal remedies for common childhood diseases as well as recommendations for building immunity naturally. Romm is a certified professional midwife, a practicing herbalist, and a physician.

 

What Your Doctor May Not Tell You About Children’s Vaccinations by Stephanie Cave, M.D. (2001). Explores the possible relationships between vaccines and autoimmune diseases/developmental disorders, and contains an overview of vaccines and the legal issues related to them, as well as an alternative vaccine schedule.

 

The Vaccine Book by Robert Sears, M.D. (2007). Contains a detailed guide to the current vaccine schedule, including a discussion of the severity and rarity of each disease and the ingredients and side effects of each vaccine. Also contains an alternative vaccine schedule.

 

The Vaccine Guide by Randall Neustaedter, O.M.D. (1996, 2002). Provides an extensive and technical overview of research on the safety of vaccines and the results of that research.

 

How to Raise a Healthy Child in Spite of Your Doctor by Robert Mendelsohn, M.D. (1984). Covers the most common childhood ailments and the appropriate treatments for them. Also contains a section on diseases commonly vaccinated against, their severity, and the effectiveness of the vaccines.

 

Vaccination, an Overview, Part 4: Building Immunity

As discussed in Part 1, vaccines are designed to stimulate the immune system. In fact, their effectiveness really comes from triggering the body’s own natural processes of adaptive immunity. The underlying assumption of vaccination is that the immune system is inherently unlikely to be strong enough to handle a disease when it encounters it in nature, which is why we need vaccines to safely and artificially engineer the encounter. This assumption, which has its origin in the aforementioned germ theory of disease, is perhaps an understandable one. In the late 19th century, scientists had observed epidemic after epidemic of infectious disease resulting in millions of casualties. It was reasonable for them to conclude that the pathogens they had discovered were indiscriminately deadly. However, one scientist of the era, a French biologist named Antoine Bechamp, offered a different explanation for why people were succumbing to infectious disease at such rates: he pinned it on their weak health.

Bechamp, a contemporary of and influence upon Pasteur, would have agreed with Pasteur’s arguments that methods of sanitization (such as hand-washing and pasteurization) would prevent the spread of disease by eliminating pathogens from the local environment. However, Bechamp’s theory was that most people who suffered from infectious disease did so because their own bodies were, in a sense, “unsanitized” – at the cellular level. According to Bechamp, when we are in a diminished state of health, our cells and tissues form a breeding ground for microorganisms (or microzymas, as he called them) that are largely already present in our bodies, but which do not take on a harmful form or reach harmful levels without the supportive environment provided by a sick individual. Bechamp’s theory stood in contrast to an interpretation of the germ theory that identified external pathogens as the sole and direct cause of infectious disease, regardless of the prior health of the diseased person.

Meanwhile, Robert Koch, a contemporary of Bechamp and Pasteur, had formulated four postulates meant to establish a causal relationship between a unique pathogen and a unique disease. The postulates specified the following: (1) the pathogen should be found in all organisms suffering from the disease, but not in healthy individuals; (2) it must be possible to isolate the pathogen and grow it in a pure culture; (3) it must cause the disease when introduced into a healthy organism; and (4) it must then be possible to re-isolate it from the inoculated organism and show it to be identical to the original. Koch later had to qualify the first postulate after finding healthy, asymptomatic carriers of the bacteria that cause cholera and typhoid fever. He also had to qualify the third postulate after finding that not all organisms exposed to a pathogen will display symptoms of infection.

Koch’s findings indicated that both Bechamp’s and Pasteur’s theories had some merit. Pasteur’s disease-centered approach, which relied on sterilization, pasteurization, quarantine, and sanitation, was focused on preventing the spread of disease by eliminating the pathogen from the external environment. Bechamp’s health-centered approach was based on making the individual stronger and healthier, and thereby better able to prevent pathogens from gaining a foothold within the environment of the human body. Although the specific mechanism of Bechamp’s theory – that of microzymas arising from our own tissues to form pathogens – has never been proven, scientists have since discovered that our health plays a tremendous role in the effective functioning of our immune systems, and consequently affects how easily we succumb to infections.

The human immune system is a conglomerate of many different body parts and processes. The skin, liver, kidneys, respiratory tract, intestinal flora and more all play a role in destroying pathogens by means of inflammation, white blood cells, and antibacterial or antiviral chemicals and enzymes.  Those pathogens are discharged via the cleansing and flushing action of tears, urine, mucus and diarrhea. The cells that form the adaptive part of the immune system are able to retain memories of specific pathogens and thereby easily neutralize those pathogens with antibodies upon future encounters.

No system, mechanical or biological, can work properly unless supplied with the proper fuel or raw materials. A car cannot run without fuel, nor an ecosystem without water, oxygen, and sun. Our immune system is no different; in order to function, it must be provided with the nutrients it needs. Vitamin D, the antioxidant vitamins A, C, and E, vitamin B6, folic acid, and the minerals zinc, copper, iron, and selenium have all been found to be vital for promoting the health of the immune system, as have the beneficial bacteria contained in raw fermented foods. Other nutrients contained in whole foods that are as yet unstudied or even undiscovered may be similarly essential. Where infants are concerned, breast milk provides, in addition to needed nutrients, a variety of immunologic factors such as immunoglobulins (antibodies), the enzymes lysozyme and lactoferrin, and lymphocytes (white blood cells). These ingredients promote not only the health but also the growth and strengthening of the infant immune system, and they protect against the routine infections that are much more commonly seen today in babies fed formula, which contains no immunological factors. Along with nutrition, people need a certain amount of rest and sleep, as well as moderate exercise and clean water, in order to maintain healthy immune function. Stress, extreme conditions, exhaustion and dehydration all weaken the immune system, and they also weaken a nursing mother’s ability to provide nourishing milk.

The scientists who were formulating the germ theory of disease in the late 19th century were living during the tail-end of the Industrial Revolution, a period of enormous social, political and technological change. Economies in Europe and America had shifted from an emphasis on rural agriculture to one on urban industry.  In England in 1750, only 15% of people were living in urban areas, but by 1860, that number had risen to 80%.  Within the cities, the lower classes (both children and adults) had started working long hours in factories for little pay, often doing heavy labor in extreme conditions. As a consequence they were frequently exhausted and, as a result of their poverty, malnourished.  The upper classes, on the other hand, deliberately chose to eat newly available refined foods that were low in nutrients and high in calories, and many women did not get enough exercise or sunlight. They too were weak and sickly and prone to death in childbirth. In sum, the majority of people living during this era were in poor health, with low functioning immune systems, and thereby had reduced resistance to disease.

At the same time, the cities to which so many had relocated lacked waste disposal systems adequate for such large populations. Consequently, pathogens were able to contaminate the air, water, food and streets. Technology had developed in a way that eased the transmission of infectious disease without yet providing a means to prevent it. Doctors themselves were some of the worst transmitters. As yet unaware of the need to wash their hands (and in many cases outright rejecting the idea), they easily spread fatal pathogens to the many patients, particularly mothers in childbirth, whom they treated in busy urban hospitals. It is no surprise that infectious diseases ran rampant and that infant mortality averaged around 40%, with the highest rates occurring in the cities.

With the formulation of the germ theory of disease, sanitary practices such as those proposed by Pasteur and the physician Ignaz Semmelweis were grudgingly accepted by physicians, with positive results. However, the theory ultimately focused far more on the danger of microbes than on the ability of the healthy human body to resist them. As a result, scientists and government officials complemented sanitary practices by arguing the need for vaccines, rather than following Bechamp’s lead in promoting a healthy diet and lifestyle that would strengthen the immune system itself.

Fortunately, due to the explosion of nutrient-deficiency diseases during the same period of time, vitamins were gradually discovered and added back into the processed foods from which they had recently been removed. This resulted in better nutrition, which, together with advances in sanitation technology, greatly improved overall health and hygiene in Europe and America following the turn of the century, though the conditions for epidemics were still occasionally created by destabilizing events such as World War I and the Great Depression. Smallpox, cholera, tuberculosis, diphtheria, scarlet fever, typhoid fever and other infectious diseases all began to decrease, whether vaccines had been developed for them or not. Endemic diseases like measles, mumps, rubella and chickenpox persisted, but were far less likely than before to cause complications or fatalities.

The only disease still causing epidemics in the developed world into the mid-20th century was polio. Polio, like many of the epidemic diseases of the time, had been around for thousands of years without ever being responsible for major epidemics prior to the late 19th century. Since 90% of polio infections cause no symptoms at all, deaths and paralysis from polio were rare. All that changed with the onset of the Industrial Revolution and the weakened health of the population; suddenly, polio could spread easily, and it met with little resistance in its victims. As sanitation began to improve, fewer people were exposed to the polio virus as young children, when they are least likely to suffer harm from it and when they can acquire long-term immunity. But while fewer people were exposed to polio, more of them died or suffered paralysis, since those who did not develop immunity as children encountered the virus as teenagers or adults, when the disease is more severe.

Additional factors in the severity of the polio epidemic were the rising fads of formula feeding and the tonsillectomy procedure. By 1950 over half of all babies were being fed infant formula (lacking polio antibodies, naturally), which was being promoted as better than breast milk by now-discredited scientific studies. It was around this time as well that performing tonsillectomies became a fad among surgeons and doctors; in the 1930s and 1940s, between 1.5 and 2 million tonsillectomies were being performed each year. The tonsils are glands that aid the immune system by blocking pathogens – when they become inflamed, it is a sign they are hard at work – and they form the first line of defense against ingested or inhaled pathogens, such as polio. Since polio was not as stymied by better sanitation as other diseases were, it was able to take advantage of the weakened immune systems of older children and adults. Still, polio, like most other infectious diseases, was already declining in incidence prior to the introduction of its vaccine.

As the historical evidence indicates, vaccines merely accelerated an already-occurring disappearance of infectious diseases in developed countries. Without advances in nutrition to ensure basic immune system function, and in sanitation to prevent the spread of pathogens, infectious disease would probably have persisted despite vaccination. Tuberculosis is a good example. During parts of the 19th century it was responsible for one quarter of all deaths in Europe. It no longer troubles the developed world, despite the fact that America and Europe never vaccinated effectively against it. However, it still causes between 1.5 and 2 million deaths per year in impoverished countries whose citizens have poor health and sanitation, despite widespread vaccination in those countries.

In conclusion, while vaccines do possess varying degrees of effectiveness, and can help to reduce the incidence of disease, they are not our most important form of protection against disease. As Bechamp theorized, the explosion of infectious disease in the 19th century was really due to a relatively brief, but steep, reduction in general health, which, when paired with unsanitary living conditions, made epidemics inevitable. Our strategy for acquiring better immunity to all diseases, or providing the conditions for such immunity to our children, should primarily be to maintain good nutrition and health through breastfeeding, consumption of natural whole foods, clean water, regular rest, regular exercise, and reduction of stress.

 

In next week’s article, entitled “Two Approaches to Vaccination,” I’ll discuss the underlying worldview behind the modern-day vaccine schedule and contrast it with a more holistic approach to public health.

Vaccination, an Overview, Part 3: The New Epidemic

In America today, the infectious diseases that remain, such as the flu, are less life-threatening, and infant mortality has drastically decreased from just a century ago. Children today are highly likely to make it to adulthood. Coinciding with the reduction of infectious disease, however, has been the emergence of an entirely new kind of health problem in children: chronic disease. Children in ever greater numbers are suffering from immune system disorders and developmental delays which have no known cause or cure. Eczema, hives, hay fever and food sensitivities have been increasing since the 1920s, with rapid surges occurring in the 1960s and the 1980s, and these allergies now affect tens of millions. Asthma has been increasing since the 1960s, particularly in developed countries, and it now affects 6 million children in the U.S. Attention-Deficit Hyperactivity Disorder has tripled in incidence since the 1970s. Autism spectrum disorders have grown from 1 in 2,000 in the 1960s and 1970s to 1 in 150 today, with the greatest spike occurring from 1996 to 2007. All of these increases in incidence are too great to be explained solely by genetic mutations (although genetic susceptibility does seem to be a factor) or by evolving diagnostic methods and definitions. Consequently, an external, environmental agent (or agents) must be triggering them. Since these diseases are chronic, appear to be unassociated with any pathogen, and are not infectious, they cannot be explained by the germ theory of disease, and scientists possess no alternative theory that would explain what in our environment could be triggering these types of health problems.
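To give a sense of scale, here is a quick back-of-the-envelope calculation using only the autism prevalence rates quoted above (illustrative figures from this article, not a primary data source):

```python
# Back-of-the-envelope comparison of the autism prevalence figures quoted above.
# These rates are illustrative, taken from the text rather than a primary source.
prevalence_1960s = 1 / 2000   # roughly 1 in 2,000 in the 1960s and 1970s
prevalence_today = 1 / 150    # roughly 1 in 150 today

fold_increase = prevalence_today / prevalence_1960s
print(f"Reported prevalence is roughly {fold_increase:.1f} times higher")
# Output: Reported prevalence is roughly 13.3 times higher
```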

It is worth noting that our environment has changed drastically over the last half-century. Our food, water and air are less likely to be contaminated by bacteria like those that cause tuberculosis or cholera, but are more likely to contain pesticides and other potentially toxic chemicals. Children who used to run and play outdoors, using up their excess energy and exposing their immune systems to many different natural substances, from pollen to poison ivy, now spend most of their time indoors in school or sitting still in front of a screen at home. At the same time they have adopted diets high in excess calories and low in nutrients. Antibiotics and pasteurization have reduced the presence of both bad and good bacteria in their lives. This new lifestyle could be the culprit behind children’s hypersensitive immune systems and hyperactive behavior, or at least a contributor. When it comes to autism spectrum disorders, however, many parents believe that vaccines play a major role.

Vaccines have never been completely without side effects, and even the safest vaccines will cause temporary side effects (such as pain and swelling, fever, vomiting, diarrhea, rashes, headaches and crying) between 5% and 40% of the time. Serious side effects are usually some form of inflammation: Guillain-Barre syndrome (an autoimmune disorder causing paralysis) and encephalitis (inflammation of the brain). However, these are said to be extremely rare. A vaccine whose serious side effects were found to be relatively more common was the first combination vaccine, DTP (diphtheria, tetanus and pertussis), released on the market in 1946. In the 1970s and 1980s there was a growing awareness that the pertussis portion of the vaccine, which used whole cells of the B. pertussis bacterium, was responsible for a higher-than-expected rate of reactions such as convulsions, shock, cardiac distress and brain damage. In 1981 Japanese scientists developed a new vaccine that used a safer acellular pertussis component and caused far fewer reactions, but this form of the vaccine was not adopted in the United States until 1996, after many years of lobbying by parents who had observed their children react adversely to the DTP vaccine.

As was the case with the DTP vaccine, suspicions of a link between autism and vaccines have their initial basis in the case reports of parents who see their children lose previously acquired mental and social skills following doses of vaccines, the majority of which are administered in the first two years of life – the same timespan in which autism usually appears. This correlation could be explained as a coincidence, but the issue is complicated by the fact that rates of autism have increased in conjunction with the rising number of shots given to children. In 1983, for example, children received vaccines for diphtheria, tetanus, pertussis (given together as DTP), polio, and measles, mumps and rubella (given together as MMR). This schedule represented vaccines for 7 diseases in the first 4 years: 6 shots containing 18 doses of vaccines, plus an additional 4 doses of the oral polio vaccine, totaling 22 doses. In 1995 the schedule was largely the same, except for the addition of the vaccine against Haemophilus influenzae type B (HIB), a bacterium that causes meningitis. After that, however, the number of vaccines began to increase. By 2007, children following the standard schedule were receiving 40 total doses of vaccines against 14 diseases, double what had been given a decade previously. At the same time, the number of shots did not greatly increase, because new combination vaccines became available that combine four or five vaccines into one shot. The result has been a significant increase in the amount of foreign material injected into a child’s body at one time.
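A minimal sketch of the arithmetic, using only the schedule counts quoted above (figures from this article, not official schedule data):

```python
# Dose totals for the two vaccine schedules described above (figures from the text).
schedules = {
    1983: {"diseases": 7, "total_doses": 18 + 4},   # 18 injected + 4 oral polio = 22
    2007: {"diseases": 14, "total_doses": 40},
}

for year, s in schedules.items():
    print(f"{year}: {s['diseases']} diseases, {s['total_doses']} total doses")

growth = schedules[2007]["total_doses"] / schedules[1983]["total_doses"]
print(f"Total doses grew roughly {growth:.1f}x from 1983 to 2007")
# Output: Total doses grew roughly 1.8x from 1983 to 2007
```

(The text’s “double in a decade” claim compares 2007 with the late 1990s; the 1.8x figure here spans the longer 1983-2007 period.)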

As discussed in last week’s newsletter, the ingredients of a vaccine must be carefully balanced and formulated in order for the vaccine to be both safe and effective. The typical vaccine components mentioned in the first section – the pathogen, the tissues in which it is cultured, an adjuvant to help stimulate immunity, and a preservative to protect the vaccine from additional pathogens – are each capable of causing unwanted side effects. Live viruses and bacteria, found in the DTP and MMR vaccines among others, are better able to stimulate immunity, but are more likely than weakened or killed pathogens to cause a persistent infection and excessive inflammation, including inflammation of the brain (encephalitis) and subsequent brain damage. Animal or human tissues in which pathogens are cultured contain proteins similar to those contained in our own tissues. In reacting to the pathogen in a vaccine, some immune systems may see these proteins as part of the threat, and produce autoantibodies against them. These autoantibodies can’t tell the difference between the injected proteins and the body’s own proteins, resulting in chronic inflammatory autoimmune diseases such as Guillain-Barre syndrome, arthritis or multiple sclerosis. The most typical adjuvant in vaccines, aluminum, is a metal that has been linked to Alzheimer’s disease, dementia and brain damage, and it may be difficult for some children to detoxify. As for preservatives, some vaccines contain formaldehyde, a carcinogen, and most vaccines previously contained thiomersal, a form of mercury, before vaccine manufacturers agreed several years ago to provide mercury-free vaccines upon request. Could these ingredients, as they are injected into children with greater frequency and in greater quantities, be responsible for the increasing incidence of chronic immune hypersensitivity and developmental disorders in children? Clearly, not all children have negative long-term reactions to vaccines; in fact, it seems that most of them don’t. But might some children have a genetic susceptibility to adverse reactions, particularly when vaccines are administered according to the current schedule?

What are the facts of the situation? First, vaccines carry the potential for adverse effects, including brain damage. Second, there is a parallel between increasing autism rates and the increased number of vaccines given. Last, autism typically emerges in children during the period of time when vaccines are administered. What have we proved? Nothing. These facts are not proof of a causal relationship between vaccines and autism – they only show a correlation. However, this correlation makes a causal relationship a possibility worth investigating, especially since no other cause of autism has been identified. Accordingly, many scientific studies have been done on whether a link between vaccines and autism exists. The initial safety studies done on each new vaccine by Merck, Sanofi Pasteur, Wyeth, and GlaxoSmithKline (the four large pharmaceutical companies that manufacture almost all vaccines), the results of which are reviewed by the FDA, have not found a link for any individual vaccine; nor has post-market monitoring through the CDC’s Vaccine Adverse Event Reporting System (VAERS). Doctors and research scientists, most notably the independent, non-profit Institute of Medicine, have conducted many additional studies over the past two decades, as well as comprehensive reviews of earlier research, and the vast majority have also concluded that no link can be proven, confirming the scientific consensus that the serious side effects of vaccines are extremely rare and do not include autism.

The most famous study that did hint at a possible connection between vaccines and autism was published in 1998 in The Lancet, a British medical journal that is perhaps the most respected in the world. Based on observations of twelve children with both inflammatory bowel disease and autism, the lead author, Dr. Andrew Wakefield, and twelve of his colleagues argued that the children might have a new syndrome caused by the vaccine-strain measles virus, which was found in their intestines. Because the children had previously been developing normally, Dr. Wakefield suggested an environmental trigger might be the cause of the syndrome, and called for the MMR vaccine (the measles-mumps-rubella combination) to be discontinued in favor of separate vaccines administered at separate times, until more research could be done. The British government, however, felt that doing so would increase children’s exposure to the three diseases. The results of the study were widely reported in the news media, and with MMR remaining the only vaccine available, many parents did not vaccinate their children against the diseases at all.

In the years that followed, both Wakefield and the study received increasing criticism. Other scientists did similar studies and reported that they had failed to duplicate the results. A journalist investigating Wakefield found that he had ties to a lawyer preparing a lawsuit against the MMR manufacturers, and that he had a patent on a new measles vaccine, both indicative of serious conflicts of interest. Ten of the twelve co-authors eventually disowned the paper. Earlier this year, The Lancet itself finally retracted the paper, and Dr. Wakefield lost his license to practice medicine in the UK.

In light of this evidence, it would seem that the possibility of any link between vaccines and autism has been thoroughly eliminated. But for a variety of reasons, we must question the credibility of those who signed off on vaccine safety, who authored and reviewed pro-vaccine studies, and who have promoted vaccines in the media. To begin with, the general public has long had good reason to distrust the ethics and integrity of the pharmaceutical industry, which has been known to disguise or minimize knowledge of adverse reactions to its products (such as Avandia, Vioxx and Fen-Phen). It has also been known to aggressively market its products to as wide a customer base as possible – in recent months, with governmental approval, even urging cholesterol-lowering drugs on people who do not have high cholesterol. Vaccines are a guaranteed lucrative investment, given that they are prescribed to almost every individual in the country.

An additional strike against the pharmaceutical companies’ assurances of safety is that they are not liable for adverse side effects of the vaccines they manufacture. In the 1980s, as more parents whose children had been injured by the DTP vaccine began to bring lawsuits against vaccine manufacturers, those manufacturers threatened to stop making vaccines entirely, reasoning that it would be unprofitable to continue if they had to pay expensive personal injury claims. In order to ensure that vaccines remained available to the public, the U.S. government stepped in and passed the National Childhood Vaccine Injury Act, which set up a special government court for hearing vaccine injury claims and awarding damages of up to $250,000. The damages are funded by proceeds from a tax on vaccines, thus shielding vaccine manufacturers from any financial liability. Claims are argued before a government-appointed judge rather than a jury, and while most claims are rejected, the court has had to award almost $2 billion in damages since its inception.

Clearly, pharmaceutical companies manufacture vaccines for profit, not out of an overriding concern for the safety of children. It is not likely that they would abandon profitable products such as vaccines even if they knew that such products caused relatively frequent and severe side effects – just as they knew, but kept secret, the fact that Avandia increased the risk of heart attacks, for example. It is therefore prudent not to accept at face value claims (and by claims, I mean advertising) by the vaccine manufacturers, and by the scientists whom they employ, that vaccines are extremely safe.

What about the government’s independent oversight and regulatory authority? Unfortunately, as in so many industries (including banking, energy, and health care), a revolving door of employment exists between the pharmaceutical companies and the federal authorities that regulate them. An example is Dr. Julie Gerberding, who directed the CDC from 2002 to 2009 – a period during which the number of vaccines administered and the number of autism cases greatly increased. Dr. Gerberding waited exactly one year and one day after leaving the CDC – the legal minimum – before taking the job of President of the Vaccine Division of Merck Pharmaceuticals. During her CDC tenure, Gerberding had heavily promoted Merck’s new-to-the-market HPV vaccine, Gardasil, as well as the safety and effectiveness of vaccines in general.

As for the scientists and medical doctors who conduct research on the safety of vaccines, many rely on the financial support of the pharmaceutical companies to carry out their research. Without that support, they would be unable to conduct wide-ranging, long-lasting epidemiological studies of vaccine reactions. The most vocal and media-friendly proponent of vaccine safety, Dr. Paul Offit of the Children’s Hospital of Philadelphia, happens to be the co-inventor of the rotavirus vaccine RotaTeq (also manufactured by Merck). The hospital’s royalty stake in RotaTeq was reportedly sold for $182 million, with Offit receiving an undisclosed share.

The conflicts of interest described so far have their origin in greed, but some can arise from humanitarian motivations. Most public health officials worry that if doubts about vaccine safety are given a more thorough hearing, many parents might choose to vaccinate their children less, or not at all (as we saw happen in the aftermath of the Wakefield study’s publication), and consequently return us to an era of epidemic disease rivaling that of the 19th and early 20th centuries. To serve the greater good, the authorities may be unwilling to give a fair hearing to the possibility of a vaccine-autism link. Even if Dr. Wakefield was partly right in his conclusions, these types of fears may have driven the government and scientific community to dissect his work for errors and conflicts and to magnify the flaws they found.

With so many powerful institutions – pharmaceutical companies, government, and scientific bodies – motivated for a variety of reasons to disprove a link between vaccines and autism, it is unlikely that any individual scientist or pediatrician would be willing to risk their reputation, potentially even their license to practice medicine, by publishing (or even conducting) a study indicating greater-than-reported side effects of vaccines. Not only would funding for such a study be difficult to obtain, but any flaws in its methodology would be far more heavily scrutinized than if the study confirmed what has already been promoted as scientific truth.

If so many conflicts of interest are at work, shouldn’t we expect to see weaknesses in the pro-vaccine studies? In fact, on closer examination, many of the studies showing that vaccines are unrelated to autism have significant methodological flaws or are reported as having broader conclusions than they really do. To take a recent example, an epidemiological study by researchers from the University of Louisville School of Medicine was published in the journal Pediatrics on May 24th of this year, concluding that giving children vaccines on schedule had no negative effect on long-term neurodevelopment. Most news outlets reported that the study had shattered the “myth” that a delayed or alternative vaccine schedule was safer than the standard, CDC-recommended schedule. However, the study was based on data from a 2007 study published in the New England Journal of Medicine intended to determine whether increased amounts of thiomersal in vaccines caused greater numbers of neuropsychological disorders. That study contained a disclaimer noting that children with autism spectrum disorders were specifically excluded from the data set. Consequently, such children were not examined in the recent study either, and the authors acknowledged that they were restricted in their ability to assess outcomes such as neurodevelopmental delay, autism, and autoimmune disorders. The differences between the two groups being compared were also small: those in the “timely” group received the recommended 10 vaccines in their first seven months, while the “untimely” group received an average of 8. The untimely group, though their shots were delayed, did not actually receive fewer vaccines at each doctor visit, and the study indicates that they may have missed vaccines for socioeconomic reasons rather than because their parents were intentionally following a different schedule. Finally, the study covered only children receiving shots from 1993 to 1997, the period just before the number of vaccine doses increased dramatically.

While, as stated above, these types of omissions and flaws are characteristic of most of the pro-vaccine studies, the discrediting of Dr. Wakefield’s study is not necessarily comforting for those wanting reassurance about the safety of vaccines either. It indicates that his conflicts of interest, as well as an error-filled study, somehow escaped the notice both of the editors of The Lancet and of the dozen co-authors who participated in the research. We must conclude that we cannot simply take for granted the results of scientific studies from even the best medical journals, having seen what happens when they are subjected to intense scrutiny. And, above all, we must keep in mind that such scrutiny is not likely to be applied to studies that confirm the scientific consensus on vaccines.

To better determine whether a connection might exist between vaccines and autism, we would need a long-term study comparing the health problems of a control group of completely unvaccinated infants against another group that follows the standard vaccine schedule, and possibly additional groups that follow selective or alternative vaccine schedules. No study of this type has yet been done. Pro-vaccine groups argue that such a study would be unethical, assuming ahead of time that vaccines are safer than the alternative, though that is exactly what the study would be meant to determine. Though such a study would be expensive, anti-vaccine groups might be able to fund it, were it not for the fact that, having staked their reputations on a link between vaccines and autism, they could not be considered an objective sponsor. Perhaps the main obstacle, however, is that a study of this type would require a large number of children to go unvaccinated, leaving them potentially susceptible to disease, and no public or private institution would want to take responsibility and liability for the potential adverse effects. Of course, autism is itself an epidemic that must be addressed, but as long as its cause remains unknown, no institution is officially liable for it. Only the families of autistic children bear the burden.

As the controversy rages on, fewer parents are taking the medical establishment (including the CDC) at its word. On May 5th, 2010, the CDC announced the results of a study it had conducted on parental compliance with the current recommended vaccine schedule. The percentage of parents who refused or delayed at least one vaccine for their children had increased from 22% in 2003 to 39% in 2008. Why? The parents cited concerns about the safety of vaccines, particularly the risk of autism. If the risks of vaccines are in fact much greater than reported, these parents seem to be making the right choice. However, one must not forget the reason we vaccinate in the first place: to protect our children from infectious diseases. Eliminating one of the possible causes of autism from your child’s life won’t do any good if the child suffers permanent damage or death from polio, measles, diphtheria, tetanus or meningitis. Suspecting that the side effects of vaccines may be greater than reported therefore leaves us with no easy decision to make. The overarching question that remains is the same one that has pursued us throughout human history: how do we safely protect our children from disease?

 

We’ll take a stab at answering that question in next week’s newsletter, “Building Immunity.”

 

Vaccination: An Overview (Parts 1 and 2)

1. How Vaccines Work

 

We live in a world permeated by microorganisms of all kinds – bacteria, fungi, even microscopic animals and plants. Microorganisms interact with human beings in a number of different ways, in many cases seeking us out as their hosts for mutual benefit. Probiotics, for example, are various species of bacteria that live in our intestines, helping us digest our food and absorb nutrients. But some viral and bacterial microorganisms, known as pathogens or germs, cause disease and death in their human hosts rather than coexisting in a mutually beneficial relationship. Vaccination is meant to be a way of protecting us from these pathogens.

Generally speaking, a vaccine is a biological solution, prepared in a laboratory, that contains a weakened or killed virus or bacteria. A person who receives a dose of a vaccine containing a microorganism becomes immune to the disease caused by that microorganism. For example, the measles vaccine grants immunity to the measles virus and thereby to the disease the virus causes. The vaccine accomplishes this by taking advantage of the amazing immune system that exists in the human body.

The immune system is a network of biological processes that combine to protect us from infectious agents such as the pathogens mentioned above. Components of the immune system include physical barriers like skin and mucus but also interior protective agents such as white blood cells and interferons (proteins that protect us from viruses). Our most complex and advanced form of immunity, known as adaptive immunity, involves antibodies (aka immunoglobulins). Antibodies are specific proteins that the immune system produces upon encountering a foreign substance such as a microbe (aka an antigen). An antibody enables the body to more quickly recognize and neutralize the antigen to which it corresponds. As a result, after just one encounter with a pathogen, we can become permanently immune to it upon any future encounters. In other words, due to our ability to produce antibodies, we are able to adapt to an attack such that the same attack won’t work on us twice.

When we are injected with a dose of a vaccine containing a weakened or killed virus or bacteria, the immune system kicks into gear and fights off the pathogen, at the same time producing antibodies against it. Ideally, the pathogen will be weak enough to pose no danger to the body, but strong enough to still stimulate antibody formation. That way, if we encounter the pathogen in the future, we’ll have the antibodies ready to fight it off regardless of its strength. In other words, we’ll be immune to it.

Most vaccines contain, in addition to the pathogen, the following ingredients: animal or human tissues, which serve as a medium in which the pathogen can be cultured; a preservative (such as thiomersal, a mercury-containing compound, or formaldehyde) to keep other pathogens from contaminating the vaccine; a stabilizer, such as MSG, to prevent the vaccine from being damaged by heat, light, acidity or humidity; and an “adjuvant,” usually aluminum, which is a substance that increases the response of the immune system. These ingredients, which vary from vaccine to vaccine, are the result of many decades of research on how to make vaccines safe, effective, and affordable.

The most crucial balance to strike in making a vaccine is between a too-strong pathogen and a too-weak one. In the former case, the pathogen may overwhelm the recipient’s immune system, resulting in disease; in the latter, the pathogen may not stimulate lasting immunity. For example, the oral polio vaccine, which used a live polio virus administered by the same route through which the actual polio virus is contracted, caused polio and subsequent paralysis in a small number of children each year before it was discontinued in the early 2000s. For this reason many vaccines are instead injected, bypassing the natural route of infection, and feature weakened or killed pathogens, relying partially on the aforementioned adjuvants for additional stimulation of the immune system. However, this method, presumably because it bypasses certain aspects of the immune system, sometimes does not result in lasting antibody production, in which case it does not confer permanent, lifelong immunity (hence the need for recurrent “booster shots” of certain vaccines). In contrast, immunity from a naturally contracted infection is more likely to be permanent, but the risk of serious disease is much greater when acquiring immunity this way. This dilemma of safety versus effectiveness – of stimulating immunity without harming the patient – has been present since the earliest and most rudimentary attempts at vaccination.

 

2. The History of Vaccination

 

Observing the progress of the Plague of Athens in 430 BC, the Greek historian Thucydides wrote that the plague (now thought to be typhus) “never took any man the second time so as to be mortal.” Those who got sick but survived did not have to fear dying from the disease later on. Similar observations of adaptive immunity may have been what led seventh-century Buddhist monks to adopt the practice of drinking a small amount of snake venom to make themselves immune to the poison from an actual bite. In ancient China, the most threatening disease was smallpox, and by the 10th century one Buddhist nun had found a method of preventing smallpox through inoculation. Inoculation, a more general term than vaccination, is the placement of something into a medium in which it can grow and reproduce – a plant part grafted onto another plant, for instance, or an antigen introduced into a human body. Inoculation with smallpox for immunization purposes is known as variolation. Over the next few centuries, variolation became common practice in China as a means of providing some protection against smallpox.

Ancient Chinese methods of variolation generally consisted of drying and pulverizing smallpox scabs from people with mild cases of smallpox and blowing the scab powder into the nostrils of healthy people. The mild cases were chosen for the same reason that vaccine makers now often use weakened or killed pathogens: to reduce the risk of inducing a serious infection. Another form of variolation was to have healthy children wear the undergarments of infected children for several days – a tactic similar to the chickenpox playdates of the 20th century, prior to the invention of the chickenpox vaccine.

Similar forms of variolation were eventually practiced in India, Byzantium and the Middle East. Through various channels, including the Crusades, the slave trade, and other forms of trade, smallpox spread to Europe and the Americas, and variolation followed. Variolation techniques came to include applying smallpox scab powder to cuts or scratches on the skin, and the process was slowly accepted in the West as a preventative against the disease, though many distrusted it because of its Eastern origin. The major drawback of variolation, however, was that people occasionally developed serious cases of smallpox from the procedure, and either died or suffered scarring and blindness. People sometimes feared the preventative almost as much as the disease itself.

In the eighteenth century, smallpox was widespread throughout England, but one group of people was curiously unaffected by the disease: dairy workers. Through their contact with cows, dairy workers typically became infected with cowpox, a disease similar to smallpox but much less dangerous, which was spread by touch from the infected udders of cows to humans. Cowpox was similar enough to smallpox that the antibodies produced by the infected workers could fight off smallpox microbes as well as cowpox microbes. One of the first people to take advantage of this phenomenon to deliberately induce immunity was an English dairy farmer, Benjamin Jesty. In 1774, during a local smallpox epidemic, Jesty infected his family with the cowpox virus that had already infected his servants and workers. The family easily recovered from cowpox and was untouched by smallpox.

Other farmers carried out similar experiments with success. Eventually, word of this immunization method reached the surgeon and scientist Edward Jenner, who in 1796 decided to test it by inoculating his gardener’s eight-year-old son with pus from a milkmaid’s cowpox blisters, and then deliberately injecting him with smallpox (scientists had a little more leeway to experiment freely back then). Since the smallpox virus did not appear to affect the boy, Jenner announced that he had been successfully “vaccinated,” deriving the term from vacca, Latin for “cow.” Jenner continued to test vaccination on dozens of additional subjects with immediate success, and thanks to his connections in scientific and government circles, was able to widely publicize his findings. He also founded an institution to promote his method, and the British government soon banned variolation in favor of vaccination.

Over the course of the 19th century, vaccination against smallpox became standard practice in most European countries, and was in some cases mandatory. However, smallpox epidemics continued, particularly during times of stress and upheaval. During the Franco-Prussian War of 1870-72, a smallpox epidemic struck France and Germany and killed over 100,000 people. Jenner himself had become aware that both the safety and the effectiveness of the smallpox vaccine were less than ideal. He had discovered that a significant number of people still developed smallpox even after vaccination, and that vaccine recipients sometimes became infected with other diseases that had contaminated the vaccine. As for the immunity conferred by vaccination, it generally lasted only 3-5 years before beginning to decline.

What Jenner did not know was the nature of smallpox and how it was transmitted. Only by the end of the 19th century did scientists investigating smallpox and the many other infectious diseases prevalent at the time (tuberculosis, diphtheria, cholera, and typhus, among others) arrive at the famous germ theory of disease. The germ theory stated that each infectious disease is caused by its own individual, microscopic, living organism. The noted French chemist Louis Pasteur was a major contributor to the theory, having proven that microorganisms, good and bad, do not generate spontaneously but reproduce by subsisting on nutrients, and can be airborne or anaerobic. Pasteur subsequently put his discoveries to use in developing pasteurization, the method of heating liquids to kill most of the microorganisms present within them.

The germ theory of disease enabled scientists to more easily develop vaccines against infectious diseases besides smallpox. Pasteur himself worked on vaccines against rabies and anthrax. Aided by his expertise in microbiology, he discovered methods for attenuating (weakening) the pathogens in vaccines so that the vaccines could confer immunity with less risk of actually causing disease. In the following decades, scientists further refined and improved the techniques of vaccine development, introducing vaccines for diphtheria, tetanus, and whooping cough prior to World War II. A polio vaccine was developed during the early 1950s. Since then, vaccines have been developed for many other infectious diseases: measles, mumps, rubella, hepatitis A and B, meningitis, chickenpox, flu, and most recently HPV and rotavirus. Today, each disease against which we routinely vaccinate has a small or nonexistent incidence in the developed world. If the 19th century was the Age of Infectious Disease, the 20th century was the Age of the Vaccine.

Modern Day Malnutrition: Anemia

In a country as wealthy as the United States, with food so abundant and affordable, it seems strange that anyone could suffer from malnutrition. And yet, not only is malnutrition common, but even the most well–off of our citizens are susceptible to it. The same goes for other developed nations. It’s not happening because we aren’t getting enough food; developed countries rarely, if ever, have famines and food shortages. Rather, it’s the nature of our food that is causing the problem. Thanks to modern food processing methods, developed countries produce a plentiful supply of food that is high in calories—sugar, white flour, corn syrup, and animal products from animals fattened up on soybeans and corn. While in centuries past many people died for want of calories, we now have more than we could ever eat, and at an affordable price. Unfortunately, those same processing methods, though they give us cheap calories, eliminate much of the nutrition from foods. Nutrients are just as important for survival as calories, so with too much of the latter and not enough of the former, it’s easy to end up both overweight and undernourished. You can be eating too much and not enough at the same time! It doesn’t help that, thanks to the structure of our society, high–calorie/low–nutrient foods are the cheapest and the most convenient.

Anemia is a good example of the malnutrition that runs rampant despite the prosperity of our country. Anemia is a blood disorder with symptoms including fatigue, pallor, depression, headaches, lower back pain, dizziness, easy bruising and slow healing, loss of sex drive, brittle nails, hair loss, thin and dry hair, dry skin, and, in extreme cases, shortness of breath and palpitations. The disease is most commonly caused by a lack of dietary iron, folic acid, and vitamin B12. Iron is necessary for the production of hemoglobin, a protein that makes it possible for red blood cells to carry oxygen to our tissues. Folic acid and vitamin B12 are essential nutrients for the formation of the red blood cells themselves. Though such nutrients are readily present in whole, natural foods, anemia affects an estimated 3 to 6 million Americans.

One reason why such deficiencies exist even in people who can afford whole foods is simply a lack of knowledge. Most doctors don’t receive a thorough education in nutrition, let alone the average American, and most people don’t realize that eliminating the cause of their symptoms could be as simple as eating better. Another reason is that our society is structured so that processed foods are cheaper and more convenient than more nutritious whole foods. Nevertheless, it would be difficult to find even one anemia sufferer who would rather endure fatigue, depression, and back pain than make some changes in diet and lifestyle that would not just eliminate those symptoms but also make for a more satisfying mealtime.

If you are (or think you may be) anemic, nutrient deficiency is very likely the cause. To increase your intake of the nutrients you need, try these recommendations:

–Add more leafy green vegetables to your diet. These include kale, collard greens, cabbage, bok choy, Swiss chard, and spinach. Leafy greens contain both iron and folic acid, as well as manganese, another nutrient important for iron absorption. They also contain chlorophyll, a compound structurally similar to hemin, the pigment that forms hemoglobin when combined with protein.

–Add more iron–rich red meat, such as lamb and beef, to your diet. These meats also contain vitamin B12 and the protein needed for forming hemoglobin. However, the meat should come from grass–fed animals: animals that did not eat their greens will have little iron in their own blood, and meat from anemic animals won’t do much to help you overcome your own anemia. Organs such as the liver and kidneys are especially rich in nutrients, and since blood cells are formed in the bone marrow, try making a soup with beef bones containing marrow.

–Seafood is another good source of iron, B12 and protein, but it should be wild caught. Organic eggs and dairy products from grass–fed cows can also provide the same nutrients.

–Other foods that contain the nutrients you need: whole grains, beans, nuts, dried fruit, and especially sea vegetables such as nori and kombu.

Whether you’re anemic or not, eating more of these foods will without a doubt increase your energy and improve your mood, and since they contain such a wide variety of nutrients, they will address other types of deficiencies as well. So give it a try, and email me with any questions!

How To Have Beautiful Clear Skin…Indirectly

Each year, people spend millions on products designed to improve the appearance of their skin. This is understandable, as the condition of our skin strongly influences our physical attractiveness and self–confidence. However, those who focus only on how their skin looks are fighting a losing battle; it’s what’s on the inside that matters, and in more than one way. To be more concerned with our physical appearance than with our conduct towards others is to invite more stress into our lives, and stress contributes to acne, eczema, and other skin disorders. And to apply products to our skin to clean it up is to ignore the nutritional deficiencies and other internal health issues that are contributing to those disorders in the first place. Just as focusing on losing weight, rather than on health, will either result in failure to lose weight or in success at the expense of health (e.g., anorexia), focusing on skin care, rather than overall health, will bring only a temporary abatement of skin problems and a lifelong dependence on care products, rather than lifetime freedom from skin disorders.

The skin is one of our organs of elimination. When there is any excess of toxins in the body, some of them will be carried out of the body by means of sweat, acne, or skin rashes such as eczema. If you eat a diet high in processed foods and low in nutrients, and are not very physically active, your body will come to contain an excess of toxins, some of which it will attempt to remove through the skin, resulting in continual skin eruptions. Excessive hormone production (which occurs during adolescence, menstruation, and during periods of stress) also contributes to skin disorders, as the hormones produced result in clogged pores that slow the elimination of toxins. Clogged pores can harbor bacteria and become infected, further worsening the condition of the skin.

If you would like to have beautiful skin naturally, the approach is simple. Take whatever you might have been spending on skin care products, and devote it to your food budget instead. By adopting a balanced diet of whole, natural foods, you will provide your body with the nutrients it needs to detoxify quickly and easily, while reducing the number of toxins that are going into your system. Reducing stress and increasing physical activity will also speed the process.

At Live Free Nutrition we believe in subtraction by addition, so here are some tips for what you can add into your life to help improve the health (and consequently the appearance) of your skin:

–Eat more foods that are full of nutrients and aid in the process of detoxification: leafy green vegetables (especially cabbage, and the broth made from boiling cabbage), cucumbers, carrots, squash, pumpkin, celery, onions, garlic, sea vegetables, whole grains (especially brown rice and millet), sprouts, and any and all fruit.

–Eat more good quality fat, particularly organic butter, chicken skin from healthy chickens, raw milk and cream, avocados, olives and their oil, eggs with deep yellow yolks, and coconut oil. Skin is mostly made from fat, and fat is necessary for you to digest the fat–soluble vitamins A, E, and K, which are essential for skin that is not just blemish–free but also vibrant and glowing. Eating more good quality fat will also help you avoid poor quality rancid fat from processed foods, which contains free radicals that contribute to wrinkles and the general breakdown of skin cells.

–After introducing healthier foods, you will experience a brief increase in skin disorders as your body takes advantage of the added nutrients to thoroughly detoxify. To get this stage over with quickly, apply tea tree oil (a natural antiseptic) to inflamed, infected areas of the skin, and powdered French green clay (mixed with water and daubed on the affected area) to acne in general, as it will draw toxins out more quickly. After the initial detoxification, if you maintain a healthy lifestyle, you will rarely need these products. Some other recommendations:

–Brushing your body in the shower with a stiff skin brush can support the skin’s eliminative action.

–Trading in all your conventional skin care products, soaps, and shampoos for organic ones, or ones free of artificial or chemical ingredients, will cut down on toxins, and will probably also eliminate rashes and many other skin problems.

–Ocean bathing, if you can get it, is very soothing to the skin.

–If you can’t find tea tree oil, lemon juice is also a natural antiseptic, and less expensive.

Eat More…Weigh Less

If there is a Holy Grail of dieting, it’s any technique that would make it possible for us to eat as much as we want without gaining weight. Anyone reviewing the most popular diets of the last few decades will see that almost all claim to have found such a technique or strategy, and to be able to deliver miraculous weight–loss results. And while weight loss is all well and good, the real appeal of such strategies is the promise that we won’t have to starve ourselves to obtain the weight loss. You don’t see many diet books out there that focus purely on shedding pounds. “Chapter 1. Eat less.” No, that wouldn’t really fly. The truly crucial section of any diet book is the part where it tells you how you can lose weight without actually dieting.

The reason why just eating less is so hard was addressed in last week’s newsletter on cravings. We eat because the food we crave is either supplying a real need, or it’s making our bodies think that it is supplying one. We already know that just controlling our cravings and eating less is extremely difficult and involves ignoring all of the body’s messages. So diets of all kinds make the promise to us that we can indulge and still lose weight. Without that promise, the diet would not have much appeal.

The irony, however, is that most diets that make this promise are already planning to break it. An Atkins–type diet promises that we can indulge in fat– and protein–rich foods, but limits carbohydrates so severely that our bodies may go into ketosis, a fat–burning process that isn’t supposed to take place unless we are truly starving—and one that can make us binge on carbs like crazy. The old high–carb diets told us that while we couldn’t eat fat, we could happily indulge in carbohydrates; without fat to make meals more filling, people ended up hungry all the time, even after eating far more carbs than they could burn. Other diets rely on artificial sweeteners and starches, along with fiber and textured protein, to make foods seem sweet and filling without providing any real nutrients, ultimately leaving their adherents malnourished. The natural consequence of following one of these deprivation diets—all of which advertise themselves as satisfying—is that while we lose weight (because we are, in one way or another, eating less), we still have uncontrollable cravings. After a few months, the diet becomes unsustainable, we stop trying, and we gain the weight back.

What many people do not realize is that our “fallback” diet—the Standard American Diet (SAD), in which we eat all we want and continue to gain weight—is itself a type of deprivation diet. Because it does not supply enough of the vitamins, minerals, and other nutrients contained in fresh fruits and vegetables, complex carbs, and natural fats, people on the SAD eat constantly but are never satisfied. The human body doesn’t know how to ask for B vitamins, retinol, or magnesium, for example—but it expects to find those nutrients in sweet, fatty, or salty foods. So that’s what we end up craving, and if we go for processed foods with those flavors, we don’t actually get the nutrients, just the calories. Consequently, even though we have more calories than we could use, the cravings come right back.

When it comes to losing weight, is “eat less” the answer? Absolutely not. It’s true that if we starve ourselves, we lose weight. But starving oneself is very unhealthy, not to mention ineffective in the long term (to put it mildly). The good news is that there’s a way for the very act of indulgence itself to be a factor in achieving a healthy weight.

If you are eating a balanced diet of whole foods, you will naturally approach your own personal healthy weight (faster or slower depending on whether you can also include physical activity in your schedule). Whole foods have just the right balance of calories and nutrition, so you only crave as much of them as you really need. In fact, once your body is used to properly cooked natural, whole foods, it will recognize their value and prefer them to junk food. The problem is that if your body isn’t familiar with these foods, it won’t naturally crave them. So what’s the solution? Like the title of the article says: eat more…weigh less. If you want a weight loss diet that actually works, it’s pretty simple: add healthy foods.

Let’s say you have chronic sugar cravings, and that you snack on cookies between meals (as I tend to do when I’m in a state of imbalance), while at the same time, because you’re trying to lose weight, you’ve reduced the size of your meals to the point that you’re eating small salads for lunch. On the Live Free Nutrition Weiner Diet Plan, you’ll allow yourself to snack on all the cookies you want, but each time you take a cookie break, you’ll first have a glass of water, a handful of cooked greens, or a piece of fruit. When you get to lunch or dinner, and you’re actually having some homemade, healthy food, eat all you want (don’t forget to include plenty of healthy fat). Then go for the dessert without guilt. Try this a few times, and suddenly you will find that you’re not quite as interested in your snacks, or your dessert, even if you still eat them for a little while out of habit. Your body is getting what it actually needs first—and suddenly you find your cravings diminishing without having had to control them at all. That’s right: you don’t have to restrict your diet one bit!

The biggest challenge in approaching weight loss this way is psychological. Because it’s a bit of a paradigm shift, it requires a change in your thinking. You may have been telling yourself for years that you just have to stop eating so much, while at the same time having cravings so strong that you can’t help yourself. Now you will be telling yourself that you need to try to eat more, while feeling full all the time. But even if thinking differently is a challenge, losing excess weight with this diet is not—and that’s as it should be. We were never meant to constantly starve and deprive ourselves just to be healthy. A healthy, fit person is satisfied and contented with their diet—someone who enjoys eating and still feels good 30 minutes (or even three hours) later. It all starts with eating more healthy foods, rather than trying to cut back on the junk food; after that, just relax and trust your body. As they always say, “You’ll be amazed by the results!”

How to Control Your Cravings

Did I get your attention with the title of this article? Who doesn’t have at least a few food cravings they wish they could control? I’m afraid, though, that my title is nothing more than an attention–getter, because I’m not actually a believer in controlling cravings. Food cravings do not arise spontaneously, and they are not just a product of your genes. They arise from your body’s deep–seated desire for nourishment. Whether your particular craving is for a specific flavor of ice cream, Coke or Pepsi, potato chips, M&Ms, white–flour pasta, coffee, or any of the other usual suspects, that craving is actually a sign of your body crying out for some type of nutrition. That’s why controlling your cravings doesn’t work. Even though we know on an intellectual level that junk foods are not good for us, those foods have been designed to appeal to the body’s desire for nutrition and balance. Our bodies crave salty foods like French fries because the body thinks saltiness is an indicator of high levels of essential minerals. We like sodas with high amounts of caffeine because they make us feel detoxified and re–energized. In other words, you have these strong cravings for junk food precisely because your body wants so badly to be healthy. While your mind may be saying “I know that’s not good for me,” your body is responding “Are you nuts? Eat that or else! We need it to survive!”

While it may be technically possible to control your cravings for a limited time through sheer willpower, the only effective, long–term solution is to meet your body’s needs with foods that are truly nourishing, rather than foods that simply appear nourishing. The former bring you into an ongoing state of balance and satisfaction; the latter satisfy for a very brief time but then leave you in a state of even greater neediness. Sometimes it’s not just nutrition that is lacking—for example, a craving for caffeine is usually a result of not getting enough sleep, and a craving for sugary foods can follow a series of stressful events in your life. Just yesterday, I found myself starting to devour a bar of chocolate after a long and stressful day. I realized, however, that the real problem was not the bar of chocolate, or my craving for it, but that at that moment I was unwilling to focus my attention on resolving the source of stress in my life. Once I did that, my cravings vanished. And in fact, that did take a little willpower—but the key is that it was willpower applied in a productive direction.

My recommendation for you is not to control your cravings, but to analyze them. Ask yourself where this craving is coming from, and what kind of need your junk food is meeting (however temporary a solution it may be). That method will put you on the right path to the heart of the problem, instead of leaving you stuck focusing on the symptoms. Maybe your body is craving junk food because it really needs whole grains and green vegetables, but isn’t familiar enough with those foods to crave them (and believe me, once your body gets used to well–prepared brown rice, you’re likely to crave it daily). Maybe you’re just looking for a physical sensation to block out the pain from some frustrating events in your life, and it’s really those events that need to be attended to. Once you have taken some steps towards understanding your situation, rather than simply feeling guilty, you’ll find that it’s a lot easier to “control” those cravings than you would ever have believed.

Seasonal Allergies

Allergic reactions in spring are very common, and it’s no wonder. We spend less time outdoors than past generations did, so our immune systems are less accustomed to foreign substances, even natural ones. Other factors, like long–distance travel and invasive species, continually expose us to new particles our bodies are unfamiliar with. When spring comes and the air fills with pollen, our immune systems overreact, and we develop rashes; itchy, watery eyes; other forms of inflammation; sneezing; mucus; fatigue; and so on. Fortunately, there is a natural, healthy way to deal with this problem: all you need to do is get your body used to the pollen by, in effect, vaccinating yourself against it. The perfect vaccine is found in raw honey made by local bees. Processed honey won’t work, because all the pollen particles (along with digestive enzymes and other good stuff) have been removed in processing. Raw honey has a somewhat stiff texture, like peanut butter, and some brands leave a layer of pollen and honeycomb on top of the honey that is extra effective for reducing allergic reactions.

A few years ago I started eating “Really Raw Honey,” the brand produced nearest to me, in Baltimore, Maryland. Each spring, as soon as my allergies started, I ate about a pound of the honey, sometimes more if necessary, until the symptoms went away. Each year I’ve needed less, and this year I haven’t needed it at all: I’ve had no symptoms. This is a really inexpensive, tasty, and permanent way to cure your allergies, and the honey has many other health benefits as well. Look in your local health food store to see which brand is produced nearest you, and give it a try!

Healing Heart Disease

I. Introduction

Heart disease is the leading cause of death for both women and men in the United States. In 2005, 27.1% of all U.S. deaths were from heart disease, and 68.3% of those were from coronary artery disease. More people died from heart disease than from strokes, respiratory diseases, diabetes, flu, pneumonia, Alzheimer’s, kidney disease, blood poisoning, and accidental causes (including car accidents) combined. Health care services, medications, lost productivity, and other costs of heart disease are projected to equal more than $304 billion in 2009.

Clearly, heart disease is a major problem in our country (and in many countries around the world, particularly developing countries). The mainstream medical community’s approach to reducing heart disease risk depends primarily on prescribing drugs to lower blood pressure, cholesterol, and triglycerides. Your dependence on these drugs is expected to be lifelong, even though they have side effects such as fatigue, dizziness, cough, frequent urination, impotence, heart arrhythmia, and muscle pain. According to mainstream medicine, you can also reduce your risk by following a diet low in saturated fat, cholesterol, and sodium; but most people struggle to follow this diet, and even those who succeed don’t necessarily see reduced cholesterol, blood pressure, or triglycerides. The failure of these dietary recommendations is taken to imply that heart disease is simply genetic, and that medication, despite its side effects, is the only way to “fix” the malfunctioning body and thereby reduce heart attack risk.

However, I believe that the real reason the low–fat, low–cholesterol diet doesn’t work is that it is based on flawed reasoning and on poor analysis of dietary studies. A truly healthy and balanced diet can, in fact, be so powerful in protecting you from heart disease that, regardless of your genetics, you probably do not need the medications at all. In what follows, I’ll explain the mechanism of heart disease so that you can clearly see why a healthy diet and lifestyle make a difference.

II. Heart Disease Pathology

The term “heart disease” covers a wide variety of heart–related health problems, the most common of which is coronary artery disease, or CAD. I’m going to focus on CAD in this article, although other heart health issues such as heart failure, ischemic heart disease, etc., respond equally well to the same diet and lifestyle changes.

CAD is a condition in which the flow of blood to the heart muscle through the coronary arteries is blocked by plaques that have accumulated over time in the arteries. The plaque is made up of cells or cell debris, cholesterol, triglycerides (fatty acids), calcium, and connective tissue. Almost everyone has some plaque built up in their arteries, and the plaques are present not just in the coronary arteries, but also in larger arteries like the aorta and the pulmonary artery, and in arteries that bring blood to the brain (which, if blocked, lead to a stroke). Over the years, the plaques grow in size and number, until finally they are large enough to block blood flow altogether. Blood flow can also be blocked by ruptured plaques getting wedged in between other plaques. The narrow width of the coronary arteries explains why they tend to be blocked before any others. Lacking sufficient oxygen from blood, the heart muscle breaks down, and the heart stops beating unless flow is quickly restored.

Clearly, reducing CAD risk is a matter of reducing arterial plaque, which is mostly made up of cholesterol and triglycerides. Research has confirmed that a person with high levels of cholesterol and triglycerides in the blood is much more likely to develop CAD. Also important is lowering blood pressure, which, if too high, increases the likelihood that a plaque will rupture and block the artery entirely. As stated above, most people try a combination of diet and medication to address these three main “risk factors”—high triglycerides, high cholesterol, and high blood pressure—but the standard recommended diet does not make much of a difference, and the medications don’t bring about a permanent cure. We’ll look at each of these risk factors one by one, exploring their relation to CAD, and clarifying why the correct diet will eliminate the risk factor at its source.

III. High Triglycerides

A triglyceride consists of three fat molecules (aka fatty acids) joined to a glycerol molecule. Most of the fat that we eat, whether animal or vegetable, is in the form of triglycerides. When triglycerides enter the body, they are either metabolized for energy, used to transport fat–soluble vitamins, or stored as “body fat.” Believe it or not, this last outcome is actually quite uncommon. Because of its density and complexity, fat digests slowly, which makes it very filling. We can only eat so much of it at once, and that limits the quantity of calories from fat that we can consume. It’s very hard to eat so much fat that you would have to store some of it as body fat, or as extra triglycerides floating around in your blood. If you have high triglycerides, or are overweight, those extra triglycerides probably came from refined carbs: white flour, white rice, sugar, and high fructose corn syrup.

Refined carbs are not as calorie–dense as fat, but they digest much more quickly. Soon after you eat a cookie, drink a soda, or work your way through a bowl of pasta, you’re hungry again. Since you can keep eating carbs without feeling satisfied, you can consume many more calories from carbs than you can from fat. When the carbs are digested, they pass into the blood as blood sugar. High blood sugar levels are useful when you’re about to exercise, but when you’re not, they are dangerous to your health. Fortunately, your body can produce insulin to remove the excess sugar from the blood and carry it to the liver, which converts it into triglycerides. These are the triglycerides that are stored as body fat and/or circulate in the bloodstream, depending on your genetics (of course, if you eat enough excess calories from carbs, you’ll have triglycerides everywhere). Refined carbs are dangerous even before they’re turned into triglycerides, because high blood sugar increases blood clotting (just think of how sticky sugar syrup is).

More and more studies are pointing to a link between diabetes and heart disease, and it’s not hard to see why: constant insulin production for lowering blood sugar is what exhausts the pancreas and causes diabetes. If we were to get our energy from healthy fats and complex carbs, instead of from refined carbs, we would be unable to eat excessive amounts of calories, while at the same time staying full and satisfied from meal to meal. We’d reduce pressure on the pancreas and we’d lower triglyceride levels.

However, I do need to make a distinction between different kinds of fats when it comes to preventing CAD. It’s true that saturated fats don’t cause high triglycerides, but that doesn’t mean those fats are always healthy, and there’s a reason why they were linked to heart disease in the first place. It’s just not the reason everyone thinks.

Most of the saturated fat that we eat comes from animal products: beef, pork, chicken, eggs, milk, cream, butter, etc. Until the mid–twentieth century, almost all livestock was raised on small family farms, given room to roam, and fed their natural diet of grasses, insects, seeds, etc. Then, as now, we ate saturated fat from these animals, but we didn’t have the same heart disease rates that we do today. Heart disease rates only went up after our meat and dairy industry moved from family farms to factory farms, in which animals weren’t given room to move, and cows in particular were fed corn and soybeans instead of green plants.

Corn and soybeans are to cows a lot like pure sugar is to human beings. Because their digestive systems are so lengthy and complicated (they’re made to digest grass, after all), a diet of grains, even whole grains, gives them far more calories than they can use, and not enough nutrition. This is deliberate: it’s a way to fatten cows up, in the same way that people put on weight when they eat our version of refined carbs (sugar and white flour). Cows turn these carbs into triglycerides and store them as saturated body fat. At the same time, they lack the nutrition they once got from grasses, so they’re unable to synthesize another kind of fat: polyunsaturated omega-3 fatty acids. Omega-3s, although they are fatty acids, actually help clear out your arteries, thinning blood clots and lowering blood pressure (they’re much better for you than aspirin, too). Prior to the establishment of factory farms, animal products were one of our main sources of omega-3s: they actually protected us from heart disease! But after we started eating meat from carb–fed, factory–farmed animals, we stopped getting those omega-3s, and heart attack rates started rising.

IV. High Cholesterol

Cholesterol is a waxy substance (an alcohol, in fact) that is manufactured by the liver. It helps form hormones, bile, and vitamin D, and provides stiffness and stability to cells. In addition to making it ourselves, we can get it from eating animal products. Cholesterol is essential to the proper functioning of the human body, but it has a very poor reputation, due to its presence in the arterial plaques that lead to heart attacks. If you have high cholesterol levels, the odds are that you have a lot of plaque built up. You have probably been told that to reduce your heart attack risk, you need to lower your cholesterol levels by eating fewer foods containing cholesterol and/or by taking cholesterol–lowering drugs such as statins. Twelve million Americans take statins, and many of them experience side effects such as muscle pain, weakness, and mental fatigue. Statins lower cholesterol levels, but not permanently, so people have to keep taking them all their lives.

Why is it that a substance your own body makes can be so dangerous? Remember that, in preventing heart disease, we’re trying to help the body work the way it’s supposed to. We want to help the heart keep beating. We trust the heart to beat properly if the blood flow is not blocked. Why don’t we trust the liver to manufacture the right amount of cholesterol? Why is a statin drug necessary to “fix” our cholesterol levels? In reality, it’s not: you can achieve healthy cholesterol levels without drugs. To understand why, we need to address the root cause of high cholesterol.

An arterial plaque, which contains not only cholesterol, but also triglycerides, calcium, connective tissue, and cell debris, is essentially a scab covering an area where an artery has been damaged. The source of the damage could be a micro–organism—some form of bad bacteria—but in most cases the artery wall has probably been damaged by free radicals.

Free radicals are molecules that have become unstable through a chemical reaction with oxygen, often triggered by heat. These “oxidized” molecules are highly reactive because they contain unpaired electrons. In the human body, a free radical will attack a stable cell and “steal” an electron from one of its molecules in order to stabilize itself. The attacked molecule then becomes a free radical and attacks another molecule, and so on, ultimately disrupting the function of the cell. Your body sometimes makes free radicals as a defense mechanism against viruses and bad bacteria, but an excess of free radicals is dangerous to the body’s cells. Excess free radicals come from stress, cigarette smoking, pollution, toxins in foods, and, most of all, rancid vegetable oils. To neutralize them, your body relies on antioxidants, which can safely “donate” an electron to a free radical. Antioxidants are found in many fresh fruits and vegetables and other natural plant foods. But if, like most people, you’re not eating enough of those foods, your body will have to manufacture antioxidants of its own. And the primary antioxidant that it manufactures is cholesterol.

Your body sends cholesterol to the damaged site via low–density lipoproteins (LDL, the “bad” cholesterol: cholesterol being carried on its way to the arteries). Cholesterol helps repair the arteries (giving stability to the cells, as stated above) and neutralize the free radicals. Afterwards, high–density lipoproteins (HDL, the “good” cholesterol) carry cholesterol back to the liver. All of this is a natural and necessary process. But if you’re taking in too many free radicals, the damaged sites can’t fully heal, and cholesterol just accumulates, getting tangled up with triglycerides and also with calcium, a mineral whose health benefits have been overemphasized in order to benefit the milk industry. Calcium is an essential nutrient, but it can’t be properly absorbed unless you also have enough magnesium in your diet, and most people don’t. Excess calcium contributes to the cholesterol plaques, making them stiff and difficult to break down (hence the term “hardening of the arteries”).

High cholesterol levels are not the result of a genetic defect, nor are they traceable to a diet high in fat and cholesterol (except to the extent that fat from factory–farmed animals contains many toxins). High cholesterol is your body’s attempt to deal with marauding free radicals, which come from poor diet and a stressful lifestyle. Artificially lowering cholesterol will not solve the problem, because you’re just fighting against your body’s efforts to protect you. Now it’s clear why cholesterol–lowering drugs must be taken continually: they address only the symptom, not the source of the problem. For real healing, you must cut down on the sources of free radicals in your life and increase your levels of antioxidants. In doing so, you’ll find that your cholesterol levels can return to normal all on their own.

V. High Blood Pressure

Blood pressure refers to the force the blood exerts against the walls of the blood vessels (veins and arteries). “High blood pressure,” also known as hypertension, refers to a condition in which the blood pressure is chronically elevated. There is a clear connection between high blood pressure and the incidence of heart attack, stroke, and kidney disease, so lowering your blood pressure to normal levels is extremely important for your long–term health.

The standard medical view is that the cause of high blood pressure is unknown. However, studies have shown that stress, tobacco, alcohol, drugs, lack of exercise, and a high–sodium diet are all correlated with high blood pressure. In fact, high blood pressure can have more than one cause, and it can cause more than one kind of damage in the body. In the case of heart disease, if blood pressure is chronically high, the blood vessels suffer increased wear and tear. They are more likely to need continual repair, which sets the stage for the formation of plaques. And if the plaques get large enough, high blood pressure can cause them to break off and clog the arteries.

If we don’t know the cause of high blood pressure, and just chalk it up to genetic factors, we’re more likely to rely on medications like beta–blockers and diuretics, even though they interfere with the body’s natural processes and have side effects including mental fatigue, depression, impotence, and nightmares. In my opinion, what causes high blood pressure isn’t all that hard to understand. High blood pressure is, obviously, a sign of increased physical tension. In many cases, that tension is a physical response to mental stress. If we’re frequently angry, frustrated, anxious, or just “tense,” that state will produce a physical reaction that increases our heart rate, tightens our muscles, and keeps our blood pressure high. However, we don’t have to be stressed out in order to have high blood pressure. We can also get it from a diet that’s too high in toxins and not high enough in nutrients.

Toxins are molecules that are either harmful to or unneeded by the body. We can get them from the food we eat, particularly processed conventional foods (organic, whole foods contain very few toxins), and also from our environment—if we’re breathing polluted air, for example. We definitely get them from alcohol, cigarettes, and drugs, even from medications. These toxins are filtered from the blood by the liver and kidneys, and are eliminated as part of our waste. Or at least, that’s how it works normally. What’s crucial is that the liver and kidneys get enough nutrients to carry out their jobs, and that they’re not overloaded with more toxins than they can handle.

If the liver and kidneys have too big a workload, it takes longer to filter the blood, and blood flow is not as smooth and regular. If the blood flow is backed up, the pressure on the blood vessels increases. If, at the same time, the diet lacks nutrients, the liver and kidneys don’t have what they need to neutralize toxins, so filtering the blood takes even longer. Retained toxins also cause more fluid retention in general, which increases blood pressure further. Even an imbalance of needed nutrients can be a problem, as is the case with sodium and potassium: sodium increases blood pressure, while potassium helps reduce it. We need both to have healthy blood pressure, but most people get too much sodium and not enough potassium.

There’s a parallel between the two causes I’ve discussed, mental stress and toxins. In the former, we find ourselves becoming more tense and pressured (whether we like it or not) as we try to deal with life’s problems. In the latter, the stress placed on the liver and kidneys by physical toxins raises the pressure in the blood vessels. Of course, we can never completely eliminate toxins, just as we can’t eliminate the things that cause stress. But we can add into our lives things that help us deal with stress and toxins in a constructive way. Nutrient–rich foods are one example. Exercise is another—it helps with mental stress by encouraging the production of feel–good hormones, and with toxins by giving our bodies the chance to sweat them out instead of relying solely on the liver and kidneys (does your sweat taste salty? There goes all that sodium!).

VI. Conclusion

Heart disease, like so many of our other modern–day health concerns, is caused primarily by stress, lack of exercise, and poor diet. Genetics may determine in what way an unhealthy lifestyle affects your body, but it does not directly cause high cholesterol, high blood pressure, or high triglycerides. These are not signs of your body’s inherent defectiveness; rather, they’re the result of your body trying to heal the damage. For long–term healing, we need to change our lifestyles so that we eliminate or negate the true cause of heart disease. To reduce your risk, try the following recommendations:

Reduce stress. There are many stressful situations in life, but they’re not worth dying for. Stress raises blood pressure and inhibits your body’s ability to burn triglycerides. It increases the production of free radicals, thereby increasing cholesterol levels. Don’t just avoid stressful situations, but try to find peace in the midst of them, so that your happiness doesn’t depend on circumstances.

Exercise more. No one looks forward to exercise if they haven’t done it in a while, or if their form of exercise is stressful. But exercise can be as simple as walking; nothing more strenuous is needed to reduce your CAD risk. Start slowly and work up to greater distances, in peaceful settings.

Eat more complex carbs. If you want to get off sugar and white flour, the way to do it is by eating more whole grains (brown rice, whole wheat, oats, barley, quinoa) and natural sweeteners (agave nectar, maple syrup, raw honey). These “complex carbs” are just as satisfying, and they’re more filling, so you can’t eat excess calories that would be stored as triglycerides.

Eat more saturated fats from grass–fed animals. Yes, I am telling you to eat butter, cream, even red meat—the kind that contains not just saturated fat but also blood–thinning omega-3 fatty acids. Coconut oil is good too. You’ll feel fuller and consume fewer calories in the end. Eating more healthy saturated fat will also help you avoid the rancid vegetable oils sold in clear containers, which contain free radicals: soybean oil, canola oil, cottonseed oil, corn oil, etc. Especially avoid hydrogenated vegetable oils (aka trans fats). For a liquid vegetable oil, choose cold–pressed, unrefined olive oil from a dark bottle. Say goodbye to cholesterol plaques!

Eat more organic/locally grown fruits and vegetables. These foods contain many antioxidants, which neutralize the free radicals produced by stress, toxins, and rancid oils. All vegetables contain some potassium, which lowers blood pressure. Many, especially leafy greens, contain the magnesium that helps you properly absorb calcium. Finally, fruits and vegetables supply the nutrients that the liver and kidneys need to detoxify the body.

Notice that these recommendations focus on the positive. I haven’t stressed cutting out smoking, alcohol, or sugar (though if you do cut back on them, that’s great!). It’s easier to add good–tasting, healthy foods into your diet than it is to abstain from processed foods, fighting your cravings with willpower. Once you have healthy foods in your diet, you’ll begin to crave them instead, so that not only is your heart attack risk greatly reduced, but you’re still eating what you like! It may be difficult at first to make these lifestyle changes, but there is no doubt that it’s worth the effort—for better day–to–day health, for better longevity, and for greater peace of mind. You are welcome to contact me for a free consultation if you would like support in improving your heart health, or with any other health issue.