URGENT! SUPPORT HR 2218 TO PROTECT CHILDREN FROM COMPULSORY DRUGGING AND PROTECT PARENTAL RIGHTS AT THE SAME TIME
ACTION ITEM: http://salsa.democracyinaction.org/o/568/t/1128/campaign.jsp?campaign_KEY=27246
OPPOSE S 324 TO PROTECT PREGNANT MOTHERS AND THEIR BABIES FROM COMPULSORY SCREENING AND COERCIVE DRUGGING:
ACTION ITEM: http://salsa.democracyinaction.org/o/568/t/1128/campaign.jsp?campaign_KEY=23065
Children are the latest victims in the Drug Crimes Against Humanity. Let me share my bias with you and then tell you why I believe that babies and children are being assaulted in increasing numbers with a deadly weapon: psychotropic drugs. These drugs kill and maim at the physical, neurological, psychological and emotional levels. They have lethal and sub-lethal side effects but are, astonishingly, handed out like candy as if they were properly tested, safe or effective. They are none of the above. Click here, http://salsa.democracyinaction.org/o/568/t/1128/campaign.jsp?campaign_KEY=27246, to tell State and Federal Legislators to protect parents’ right to make medical decisions for their children and stop the use of government money for unscientific and skewed screening tests that herd kids into the drug pushers’ offices, where unnecessary and dangerous prescriptions await them.
Pregnant mothers are up for “protection” from postpartum depression by being “screened” with phony screening tools and then “offered” drugs which the PDR advises doctors to avoid or use with extreme caution in women of child-bearing age. Infants exposed to these toxic compounds can suffer a horrifying range of damage, including being born with their internal organs outside of their bodies and lifelong brain damage. “Never mind,” says Big Pharma, “pregnant and new moms are an untapped market. Let’s go for it! And just think! Babies with brain damage, diabetes, etc., all require meds for the rest of their lives. Yes, indeed! We will surely go for it.” And go for it they did, getting the bill passed in the House of Representatives; its companion bill, S 324, is now before the Senate. Click here, http://salsa.democracyinaction.org/o/568/t/1128/campaign.jsp?campaign_KEY=23065, to tell your Senators not to pass this dangerous and totally unnecessary bill.
I graduated from the Albert Einstein College of Medicine in 1970 and took my postgraduate training in Child, Adolescent and Adult Psychiatry, finishing in 1975. I am trained in psychoanalysis, group therapy and a host of other modalities. I have run drug and other treatment facilities, worked in inpatient and outpatient facilities for children, adolescents and adults, and have been in the private practice of psychiatry and medicine for decades.
All without drugs, electroshock or other dangerous, primitive and harmful techniques. I believe that psychoactive drugs, like virtually all other drugs, are dangerous and, unless you are in a surgery suite or an emergency room, unnecessary.
This is a conviction born out of a very long and successful drug free medical and psychiatric practice (during which, unlike most of my medical colleagues, I have never been sued for malpractice).
When I saw the article reproduced below in the most recent issue of the Schafer Autism Report, I wrote to congratulate the Report for publishing this outstanding piece decrying the medication of millions and millions of children for little more than mythic disorders.
Representative Ron Paul, MD (R-TX) introduced the Parental Consent Act, HR 2218, on April 30, 2009. The bill prevents Federal monies from being used to support mental health screenings, which are nothing short of pharma marketing tools aimed at kids. They have no scientific validity and are developed and promoted by the greedy folks at Big Pharma. Kids answer trick questions in normal ways and are “diagnosed” with phony terms and labels. Parents who resist the requirement for medication which almost always follows face enormous pressure, including jail time for “medical neglect” or “child abuse.” This sells pills, all right, but it sure does not protect rights or kids’ brains and bodies.
As a psychiatrist and physician, I can tell you that psychoactive drugs are dangerous. They can cause permanent physical damage, obesity, suicide, homicide, diabetes and lifelong neurological damage; they can rob children of their moods and their developmental opportunities; and much, much more. Every single school shooting in the US has involved kids either on drugs or coming off them. There is, in my opinion, absolutely no excuse for psychoactive medicines.
Furthermore, parents have the right, and must continue to have the right, to make the life and death decisions for the children with whom they have been entrusted. Those rights are fundamentally the same as the rights we claim for ourselves to make our own decisions about what happens to our own bodies. Absent that, our bodies are owned by others who make decisions about what happens to them and we are, by definition, slaves. I have no wish to be a slave to the government of any country or to its corporations, including Big Pharma. So it is my duty to oppose these pieces of legislation.
This is an invitation to join me in that opposition and bring all of your contacts along.
By the way, the Natural Solutions Foundation is a privately supported not-for-profit, tax-exempt organization and we depend on your donations. Please visit http://drrimatruthreports.com/?page_id=189 to make your donation. Recurring donations are especially helpful. We appreciate your support, whether large or small.
Yours in health and freedom,
Dr. Rima
Rima E. Laibow, MD
Medical Director
Natural Solutions Foundation
www.HealthFreedomUSA.org
www.GlobalHealthFreedom.org
www.NaturalSolutionsFoundation.org
www.NaturalSolutionsMarketPlace.org
www.Organics4U.org
www.HealthFreedomRadio.com
Here is the letter I wrote to the Editor of the Schafer Autism Report:
To the Editor:
I am writing to congratulate you on your publication of “The Wholesale Sedation of America’s Youth” by Andrew M. Weiss. As a Child, Adolescent and Adult Psychiatrist who has practiced drug-free medicine for my entire career, I found myself reading my own thoughts and writings in this excellent article. Physicians, nurse practitioners and others who endorse and enforce medicating children because they have been entrained or constrained to do so win the approval, praise and appreciation of the forces that use them and of their peers. They are, in fact, worthy of scorn and, at best, loss of their licenses or, at worst, criminal prosecution for their mindless, damaging and cowardly refusal to think clearly about the needs of the children they are charged to heal, not poison.
Every doctor is trained to think logically and systematically about diagnosis and treatment. If they refuse to use that training because they have allowed themselves to be brainwashed and browbeaten into down and dirty, quick and quality-less medicine, then shame upon each and every one of them. Drug ads, phony science and cheerleader “continuing medical education” seminars are nothing short of cynical, organized deceptions designed to accomplish one goal and one goal only: the generation of massive profits.
Who stands between a child and pharmaceutical damage? A doctor. Who steps aside for 8 million American children every year? A doctor. If parents object or refuse to medicate their children, they run the very real risk of being charged with medical abuse or neglect, losing their children and/or facing criminal charges for trying to protect the vulnerable youngsters in their care. Commonly, drug-company-sponsored “screening tools” used by teachers or other school personnel are what got the kids in front of the doctor or nurse practitioner, staring at the dangerous end of a prescription pad.
On April 30, 2009, Representative Ron Paul (R-TX) introduced the Parental Consent Act, HR 2218, “To prohibit the use of Federal funds for any universal or mandatory mental health screening program.” The ominously Orwellian-named “New Freedoms Initiative,” passed in 2004 during the drug-friendly reign of President George W. Bush, provides for mandatory screening of every child from 0 to 18. In utero screening is accomplished by “mental health screening” of pregnant women and the compulsory drugging of those women to “protect” the unborn child, despite the long-standing cautions urged on doctors to avoid the use of psychotropic medication in women of child-bearing age because of the known and unknown dangers inherent in exposing unborn or nursing babies to those drugs.
The New Freedoms Initiative also mandates the screening for “mental problems” of everyone involved in any way with children – parents, grandparents, teachers, policemen and women, merchants who sell things to children, clergy, doctors, nurses, etc. In short, everyone.
The madness must stop. Doctors must think about children and childhood as a developmental process, not a disease. Parents must be free to be what the law says they are, “GUARDians,” and bureaucrats and administrators, teachers and others involved with children must ask why a child is showing signs of stress or distress and look for ways to solve that problem, not dissolve the child’s mind in a chemical soup of long- and short-term toxicity.
The Natural Solutions Foundation, www.GlobalHealthFreedom.org and www.HealthFreedomUSA.org, of which I am proud to be the Medical Director, supports the right of every person to make their own health decisions and, of course, the right of parents to make those decisions for their children. And we strongly support the rights of parents and others to say “NO!” to drugs and “No” to the compulsory screenings that put kids onto subjective and profitable diagnostic conveyor belts.
Our Health Freedom Action eAlerts offer action options to concerned parents and other persons to preserve these essential rights.
Medical fascism is facing us all. Soviet Russia was condemned world-wide because it condoned the atrocious use of psychoactive drugs to control its population and prevent behaviors it found disagreeable or unwelcome in vast numbers of people. Are our children our dissidents? Do their discontents require chemical straitjackets and personality-ectomies? Have we become mindless mind-assassins, robbing our children of their emotions and their neurological developmental opportunities because we do not dare to ask the penetrating question, “WHY?”, about this drug mania we have been marketed into?
Since graduation from Albert Einstein College of Medicine in 1970 and completion of my Child and Adolescent Psychiatry Fellowship in 1975, I have practiced medicine and psychiatry without resorting to drugs. The results have been nothing short of astonishing for someone trained in the “Medical Model” – my patients got well because the underlying causes of their discomforts, disabilities, distortions and difficulties were uncovered and addressed. Using intensive nutritional strategies, herbology, homeopathy, detoxification, NeuroBioFeedback, frequency medicine and a host of other techniques, each patient was treated individually and their treatment tailored to their realities, including emotional ones. This type of medicine takes time – lots of it – and therefore the cottage-industry, piecework compensation which doctors have allowed insurance carriers to impose upon them (carriers which are often co-owned by Big Pharma, so that forcing doctors to see more patients in a shorter time is a successful marketing ploy for their shareholders’ interests) makes the economics unpalatable to insurance companies. Doctors have, in the main, behaved like good serfs and allowed themselves to be made wage slaves to the interests of the insurance companies, seeing more patients in shorter slots – and writing prescriptions quickly so they can see the next patient and the next and the next.
The solution? If you are a parent, find a health care professional who does not take insurance and pay for treatment so you and the doctor can spend as much time as your child needs. If you are a doctor or nurse practitioner, rethink your slavish devotion to the medicine of convenience – yours – and start doing what you have been expensively trained to do: think about root causes, look for underlying factors and return to your roots as a healer. Yes, you will have to unlearn much and question more. But you were a bright student looking for ways to help people when you fought your way into medical school. You were, after all, the best and the brightest. You may still have the capacity to think and to discern real science from marketing. And, somewhere deep down inside you, perhaps you still have a deep commitment to service and truth.
You will quickly find, if you follow the intellectual path I am advocating, that many of your most cherished beliefs must be abandoned by the wayside. One of those beliefs is that you must continue to take insurance payment for your services or you will not make a living. In fact, those doctors who have dared to let go of the insurance teat report that they are making more money, spending less on overhead and serving patients better than they dreamed possible before they took the plunge into service, not serfdom.
Yours in health and freedom,
Rima E. Laibow, MD
Medical Director
Natural Solutions Foundation
www.HealthFreedomUSA.org
www.GlobalHealthFreedom.org
And here is the Special Edition article to which I was responding:
Special Edition
The Wholesale Sedation of America’s Youth
By Andrew M. Weiss, Skeptical Inquirer. is.gd/yXAW
In the winter of 2000, the Journal of the American Medical Association published the results of a study indicating that 200,000 two- to four-year-olds had been prescribed Ritalin for an “attention disorder” from 1991 to 1995. Judging by the response, the image of hundreds of thousands of mothers grinding up stimulants to put into the sippy cups of their preschoolers was apparently not a pretty one.
Most national magazines and newspapers covered the story; some even expressed dismay or outrage at this exacerbation of what already seemed like a juggernaut of hyper-medicalizing childhood. The public reaction, however, was tame; the medical community, after a moment’s pause, continued unfazed. Today, the total toddler count is well past one million, and influential psychiatrists have insisted that mental health prescriptions are appropriate for children as young as twelve months. For the pharmaceutical companies, this is progress.
In 1995, 2,357,833 children were diagnosed with ADHD (Woodwell 1997) — twice the number diagnosed in 1990. By 1999, 3.4 percent of all American children had received a stimulant prescription for an attention disorder. Today, that number is closer to ten percent. Stimulants aren’t the only drugs being given out like candy to our children. A variety of other psychotropics like antidepressants, antipsychotics, and sedatives are finding their way into babies’ medicine cabinets in large numbers. In fact, the worldwide market for these drugs is growing at a rate of ten percent a year, $20.7 billion in sales of antipsychotics alone (for 2007, IMSHealth 2008).
While the sheer volume of psychotropics being prescribed for children might, in and of itself, produce alarm, there has not been a substantial backlash against drug use in large part because of the widespread perception that “medically authorized” drugs must be safe. Yet, there is considerable evidence that psychoactive drugs do not take second place to other controlled pharmaceuticals in carrying grave and substantial risks. All classes of psychoactive drugs are associated with patient deaths, and each produces serious side effects, some of which are life-threatening.
In 2005, researchers analyzed data from 250,000 patients in the Netherlands and concluded that “we can be reasonably sure that antipsychotics are associated in something like a threefold increase in sudden cardiac death, and perhaps that older antipsychotics may be worse” (Straus et al. 2004). In 2007, the FDA chose to beef up its black box warning (reserved for substances that represent the most serious danger to the public) against antidepressants concluding, “the trend across age groups toward an association between antidepressants and suicidality . . . was convincing, particularly when superimposed on earlier analyses of data on adolescents from randomized, controlled trials” (Friedman and Leon 2007). Antidepressants have been banned for use with children in the UK since 2003. According to a confidential FDA report, prolonged administration of amphetamines (the standard treatment for ADD and ADHD) “may lead to drug dependence and must be avoided.” They further reported that “misuse of amphetamine may cause sudden death and serious cardiovascular adverse events” (Food and Drug Administration 2005). The risk of fatal toxicity from lithium carbonate, a not uncommon treatment for bipolar disorder, has been well documented since the 1950s. Incidents of fatal seizures from sedative-hypnotics, especially when mixed with alcohol, have been recorded since the 1920s.
Psychotropics carry nonfatal risks as well. Physical dependence and severe withdrawal symptoms are associated with virtually all psychoactive drugs. Psychological addiction is axiomatic. Concomitant side effects range from unpleasant to devastating, including: insulin resistance, narcolepsy, tardive dyskinesia (a movement disorder affecting 15–20 percent of antipsychotic patients where there are uncontrolled facial movements and sometimes jerking or twisting movements of other body parts), agranulocytosis (a reduction in white blood cells, which is life threatening), accelerated appetite, vomiting, allergic reactions, uncontrolled blinking, slurred speech, diabetes, balance irregularities, irregular heartbeat, chest pain, sleep disorders, fever, and severe headaches. The attempt to control these side effects has resulted in many children taking as many as eight additional drugs every day, but in many cases, this has only compounded the problem. Each “helper” drug produces unwanted side effects of its own.
The child drug market has also spawned a vigorous black market in high schools and colleges, particularly for stimulants. Students have learned to fake the symptoms of ADD in order to obtain amphetamine prescriptions that are subsequently sold to fellow students. Such “shopping” for prescription drugs has even spawned a new verb. The practice is commonly called “pharming.” A 2005 report from the Partnership for a Drug Free America, based on a survey of more than 7,300 teenagers, found one in ten teenagers, or 2.3 million young people, had tried prescription stimulants without a doctor’s order, and 29 percent of those surveyed said they had close friends who have abused prescription stimulants.
In a larger sense, the whole undertaking has had the disturbing effect of making drug use an accepted part of childhood. Few cultures anywhere on earth and anytime in the past have been so willing to provide stimulants and sedative-hypnotics to their offspring, especially at such tender ages. An entire generation of young people has been brought up to believe that drug-seeking behavior is both rational and respectable and that most psychological problems have a pharmacological solution. With the ubiquity of psychotropics, children now have the means, opportunity, example, and encouragement to develop a lifelong habit of self-medicating.
Common population estimates include at least eight million children, ages two to eighteen, receiving prescriptions for ADD, ADHD, bipolar disorder, autism, simple depression, schizophrenia, and the dozens of other disorders now included in psychiatric classification manuals. Yet sixty years ago, it was virtually impossible for a child to be considered mentally ill. The first diagnostic manual published by American psychiatrists in 1952, DSM-I, included among its 106 diagnoses only one for a child: Adjustment Reaction of Childhood/Adolescence. The other 105 diagnoses were specifically for adults. The number of children actually diagnosed with a mental disorder in the early 1950s would hardly move today’s needle. There were, at most, 7,500 children in various settings who were believed to be mentally ill at that time, and most of these had explicit neurological symptoms.
Of course, if there really are one thousand times as many kids with authentic mental disorders now as there were fifty years ago, then the explosion in drug prescriptions in the years since only indicates an appropriate medical response to a newly recognized pandemic, but there are other possible explanations for this meteoric rise. The last fifty years has seen significant social changes, many with a profound effect on children. Burgeoning birth rates, the decline of the extended family, widespread divorce, changing sexual and social mores, households with two working parents — it is fair to say that the whole fabric of life took on new dimensions in the last half century. The legal drug culture, too, became an omnipresent adjunct to daily existence. Stimulants, analgesics, sedatives, decongestants, penicillins, statins, diuretics, antibiotics, and a host of others soon found their way into every bathroom cabinet, while children became frequent visitors to the family physician for drugs and vaccines that we now believe are vital to our health and happiness. There is also the looming motive of money. The New York Times reported in 2005 that physicians who had received substantial payments from pharmaceutical companies were five times more likely to prescribe a drug regimen to a child than those who had refused such payments.
So other factors may well have contributed to the upsurge in psychiatric diagnoses over the past fifty years. But even if the increase reflects an authentic epidemic of mental health problems in our children, it is not certain that medication has ever been the right way to handle it. The medical “disease” model is one approach to understanding these behaviors, but there are others, including a hastily discarded psychodynamic model that had a good record of effective symptom relief. Alternative, less invasive treatments, too, like nutritional treatments, early intervention, and teacher and parent training programs were found to be at least as effective as medication in long-term reduction of a variety of symptoms (of ADHD, The MTA Cooperative Group 1999).
Nevertheless, the medical-pharmaceutical alliance has largely shrugged off other approaches and scoffed at the potential for conflicts of interest and continues to medicate children in ever-increasing numbers. With the proportion of diagnosed kids growing every month, it may be time to take another look at the practice and soberly reflect on whether we want to continue down this path. In that spirit, it is not unreasonable to ask whether this exponential expansion in medicating children has another explanation altogether. What if children are the same as they always were? After all, virtually every symptom now thought of as diagnostic was once an aspect of temperament or character. We may not have liked it when a child was sluggish, hyperactive, moody, fragile, or pestering, but we didn’t ask his parents to medicate him with powerful chemicals either. What if there is no such thing as mental illness in children (except the small, chronic, often neurological minority we once recognized)? What if it is only our perception of childhood that has changed? To answer this, we must look at our history and at our nature.
The human inclination to use psychoactive substances predates civilization. Alcohol has been found in late Stone Age jugs; beer may have been fermented before the invention of bread. Nicotine metabolites have been found in ancient human remains and in pipes in the Near East and Africa. Knowledge of Hul Gil, the “joy plant,” was passed from the Sumerians, in the fifth millennium b.c.e., to the Assyrians, then in serial order to the Babylonians, Egyptians, Greeks, Persians, Indians, then to the Portuguese who would introduce it to the Chinese, who grew it and traded it back to the Europeans. Hul Gil was the Sumerian name for the opium poppy. Before the Middle Ages, economies were established around opium, and wars were fought to protect avenues of supply.
With the modern science of chemistry in the nineteenth century, new synthetic substances were developed that shared many of the same desirable qualities as the more traditional sedatives and stimulants. The first modern drugs were barbiturates — a class of 2,500 sedative/hypnotics that were first synthesized in 1864. Barbiturates became very popular in the U.S. for depression and insomnia, especially after the temperance movement resulted in draconian anti-drug legislation (most notoriously Prohibition) just after World War I. But variety was limited and fears of death by convulsion and the Winthrop drug-scare kept barbiturates from more general distribution.
Stimulants, typically caffeine and nicotine, were already ubiquitous in the first half of the twentieth century, but more potent varieties would have to wait until amphetamines came into widespread use in the 1930s. Amphetamines were not widely known until the 1920s and 1930s when they were first used to treat asthma, hay fever, and the common cold. In 1932, the Benzedrine Inhaler was introduced to the market and was a huge over-the-counter success. With the introduction of Dexedrine in the form of small, cheap pills, amphetamines were prescribed for depression, Parkinson’s disease, epilepsy, motion sickness, night-blindness, obesity, narcolepsy, impotence, apathy, and, of course, hyperactivity in children.
Amphetamines came into still wider use during World War II, when they were given out freely to GIs for fatigue. When the GIs returned home, they brought their appetite for stimulants to their family physicians. By 1962, Americans were ingesting the equivalent of forty-three ten-milligram doses of amphetamine per person annually (according to FDA manufacturer surveys).
Still, in the 1950s, the family physician’s involvement in furnishing psychoactive medications for the treatment of primarily psychological complaints was largely sub rosa. It became far more widespread and notorious in the 1960s. There were two reasons for this. First, a new, safer class of sedative hypnotics, the benzodiazepines, including Librium and Valium, were an instant sensation, especially among housewives who called them “mothers’ helpers.” Second, amphetamines had finally been approved for use with children (their use up to that point had been “off-label,” meaning that they were prescribed despite the lack of FDA authorization).
Pharmaceutical companies, coincidentally, became more aggressive in marketing their products with the tremendous success of amphetamines. Valium was marketed directly to physicians and indirectly through a public relations campaign that implied that benzodiazepines offered sedative/hypnotic benefits without the risk of addiction or death from drug interactions or suicide. Within fifteen years of its introduction, 2.3 billion Valium pills were being sold annually in the U.S. (Sample 2005).
So, family physicians became society’s instruments: the suppliers of choice for legal mood-altering drugs. But medical practitioners required scientific authority to protect their reputations, and the public required a justification for its drug-seeking behavior. The pharmaceutical companies were quick to offer a pseudoscientific conjecture that satisfied both. They argued that neurochemical transmitters, only recently identified, were in fact the long sought after mediators of mood and activity. Psychological complaints, consequently, were a function of an imbalance of these neural chemicals that could be corrected with stimulants and sedatives (and later antidepressants and antipsychotics). While the assertion was pure fantasy without a shred of evidence, so little was known about the brain’s true actions that the artifice was tamely accepted. This would later prove devastating when children became the targets of pharmaceutical expansion.
With Ritalin’s FDA approval for the treatment of hyperactivity in children, the same marketing techniques that had been so successful with other drugs were applied to the new amphetamine. Pharmaceutical companies had a vested interest in the increase in sales; they spared no expense in convincing physicians to prescribe them. Cash payments, stock options, paid junkets, no-work consultancies, and other inducements encouraged physicians to relax their natural caution about medicating children. Parents also were targeted. For example, CIBA, the maker of Ritalin, made large direct payments to parents’ support groups like CHADD (Children and Adults with Attention Deficit/Hyperactivity Disorder) (The Merrow Report 1995). To increase the acceptance of stimulants, drug companies paid researchers to publish favorable articles on the effectiveness of stimulant treatments. They also endowed chairs and paid for the establishment of clinics in influential medical schools, particularly ones associated with universities of international reputation. By the mid 1970s, more than half a million children had already been medicated primarily for hyperactivity.
The brand of psychiatry that became increasingly popular in the 1980s and 1990s did not have its roots in notions of normal behavior or personality theory; it grew out of the concrete, atheoretical treatment style used in clinics and institutions for the profoundly disturbed. German psychiatrist Emil Kraepelin, not Freud, was the God of mental hospitals, and pharmaceuticals were the panacea. So the whole underlying notion of psychiatric treatment, diagnosis, and disease changed. Psychiatry, which had straddled psychology and medicine for a hundred years, abruptly abandoned psychology for a comfortable sinecure within its traditional parent discipline. The change was profound.
People seeking treatment were no longer clients, they were patients. Their complaints were no longer suggestive of a complex mental organization, they were symptoms of a disease. Patients were not active participants in a collaborative treatment, they were passive recipients of symptom-reducing substances. Mental disturbances were no longer caused by unique combinations of personality, character, disposition, and upbringing, they were attributed to pre-birth anomalies that caused vague chemical imbalances. Cures were no longer anticipated or sought; mental disorders were inherited illnesses, like birth defects, that could not be cured except by some future magic, genetic bullet. All that could be done was to treat symptoms chemically, and this was being done with astonishing ease and regularity.
In many ways, children are the ideal patients for drugs. By nature, they are often passive and compliant when told by a parent to take a pill. Children are also generally optimistic and less likely to balk at treatment than adults. Even if they are inclined to complain, the parent is a ready intermediary between the physician and the patient. Parents are willing to participate in the enforcement of treatments once they have justified them in their own minds and, unlike adults, many kids do not have the luxury of discontinuing an unpleasant medication. Children are additionally not aware of how they ought to feel. They adjust to the drugs’ effects as if they are natural and are more tolerant of side effects than adults. Pharmaceutical companies recognized these assets and soon were targeting new drugs specifically at children.
But third-party insurance providers balked at the surge in costs for treatment of previously unknown, psychological syndromes, especially since unwanted drug effects were making some cases complicated and expensive. Medicine’s growing prosperity as the purveyor of treatments for mental disorders was threatened, and the industry’s response was predictable. Psychiatry found that it could meet insurance company requirements by simplifying diagnoses, reducing identification to the mere appearance of certain symptoms. By 1980, they had published all new standards.
Lost in the process was the fact that the redefined diagnoses (and a host of new additions) failed to meet minimal standards of falsifiability and differentiability. This meant that the diagnoses could never be disproved and that they could not be indisputably distinguished from one another. The new disorders were also defined as lists of symptoms from which a physician could check off a certain number of hits like a Chinese menu, which led to reification, an egregious scientific impropriety. Insurers, however, with their exceptions undermined and under pressure from parents and physicians, eventually withdrew their objections. From that moment on, the treatment of children with powerful psychotropic medications grew unchecked.
As new psychotropics became available, their uses were quickly extended to children despite, in many cases, indications that the drugs were intended for use with adults only. New antipsychotics, the atypicals, were synthesized and marketed beginning in the 1970s. Subsequently, a new class of antidepressants like Prozac and Zoloft was introduced. These drugs were added to the catalogue of childhood drug treatments with an astonishing casualness even as stimulant treatment for hyperactivity continued to burgeon.
In 1980, hyperactivity, which had been imprudently named “minimal brain dysfunction” in the 1960s, was renamed Attention Deficit Disorder in order to be more politic, but there was an unintended consequence of the move. Parents and teachers, familiar with the name but not always with the symptoms, frequently misidentified children who were shy, slow, or sad (introverted rather than inattentive) as suffering from ADD. Rather than correct the mistake, though, some enterprising physicians responded by prescribing the same drug for the opposite symptoms. This was justified on the grounds that stimulants, which were being offered because they slowed down hyperactive children, might very well have the predicted effect of speeding up under-active kids. In this way, a whole new population of children became eligible for medication. Later, the authors of DSM-III memorialized this practice by renaming ADD again, this time as ADHD, and redefining ADD as inattention. Psychiatry had reached a new level: they were now willing to invent an illness to justify a treatment. It would not be the last time this was done.
In the last twenty years, a new, more disturbing trend has become popular: the re-branding of legacy forms of mental disturbance as broad categories of childhood illness. Manic depressive illness and infantile autism, two previously rare disorders, were redefined through this process as “spectrum” illnesses with loosened criteria and symptom lists that cover a wide range of previously normal behavior. With this slim justification in place, more than a million children have been treated with psychotropics for bipolar disorder and another 200,000 for autism. A recent article in this magazine, “The Bipolar Bamboozle” (Flora and Bobby 2008), illuminates how and why an illness that once occurred twice in every 100,000 Americans has been recast as an epidemic affecting millions.
To overwhelmed parents, drugs solve a whole host of ancillary problems. The relatively low cost (at least in out-of-pocket dollars) and the small commitment of time for drug treatments make them attractive to parents who are already stretched thin by work and home life. Those whose confidence is shaken by indications that their children are “out of control” or “unruly” or “disturbed” are soothed by the seeming inevitability of an inherited disease that is shared by so many others. Rather than blaming themselves for being poor home managers, guardians with insufficient skills, or neglectful caretakers, parents can find comfort in the thought that their child, through no fault of theirs, has succumbed to a modern and widely accepted scourge. A psychiatric diagnosis also works well as an authoritative response to demands made by teachers and school administrators to address their child’s “problems.”
Once a medical illness has been identified, all unwanted behavior becomes fruit of the same tree. Even the children themselves are often at first relieved that their asocial or antisocial impulses reflect an underlying disease and not some flaw in their characters or personalities.
Conclusions
In the last analysis, childhood has been thoroughly and effectively redefined. Character and temperament have been largely removed from the vocabulary of human personality. Virtually every single undesirable impulse of children has taken on pathological proportions and diagnostic significance. Yet, if the psychiatric community is wrong in their theories and hypotheses, then a generation of parents has been deluded while millions of children have been sentenced to a lifetime of ingesting powerful and dangerous drugs.
Considering the enormous benefits reaped by the medical community, it is no surprise that critics have argued that the whole enterprise is a cynical, reckless artifice crafted to unfairly enrich them. Even though this is undoubtedly not true, physicians and pharmaceutical companies must answer for the rush to medicate our most vulnerable citizens based on little evidence, a weak theoretical model, and an antiquated and repudiated philosophy. For its part, the scientific community must answer for its timidity in challenging treatments made in the absence of clinical observation and justified by research of insufficient rigor performed by professionals and institutions whose objectivity is clearly in question, because their own interests are materially entwined in their findings.
It should hardly be necessary to remind physicians that even if their diagnoses are real, they are still admonished by Galen’s dictum Primum non nocere, or “first, do no harm.” If with no other population, this ought to be our standard when dealing with children. Yet we have chosen the most invasive, destructive, and potentially lethal treatment imaginable while rejecting other options that show great promise of being at least as effective and far safer. But these other methods are more expensive, more complicated, and more time-consuming, and thus far, we have not proved willing to bear the cost. Instead, we have jumped at a discounted treatment, a soft-drink-machine cure: easy, cheap, fast, and putatively scientific. Sadly, the difference in price is now being paid by eight million children.
Mental illness is a fact of life, and it is naïve to imagine that there are not seriously disturbed children in every neighborhood and school. What is more, in the straitened economy of child rearing and education, medication may be the most efficient and cost effective treatment for some of these children. Nevertheless, to medicate not just the neediest, most complicated cases but one child in every ten, despite the availability of less destructive treatments and regardless of doubtful science, is a tragedy of epic proportions.
What we all have to fear, at long last, is not having been wrong but having done wrong. That will be judged in a court of a different sort. Instead of humility, we continue to feed drugs to our children with blithe indifference. Even when a child’s mind is truly disturbed (and our standards need to be revised drastically on this score), a treatment model that intends to chemically palliate and manage ought to be our last resort, not our first option. How many more children need to be sacrificed for us to see the harm in expediency, greed, and plain ignorance?
Schafer Autism Report
http://www.sarnet.org/lib/todaySAR.htm
The Natural Solutions Foundation has been urging the US for years to examine its food policies in favor of clean, unadulterated, locally grown, GMO-free foods. We have asked supporters to write letters, met with senior Congressional aides and members of Congress, and attended Codex meetings where FDA and USDA representatives foster the worst of the worst of the multinational interests with respect to adulterated food and enhanced profits.
All along, we have been educating our supporters, who number in the hundreds of thousands, and others as well, to understand that the economic, social, personal and national impact of a degraded food system is the destruction not only of the individual, but of the entire society.
If people are dying or dead, or caring for the ill, they cannot go to school, work or carry out the essential functions of a society. If 16% of the GNP goes, as it does in America, to health care that does not care about health but profits only from illness, and food, the only source of nutrition and health, is contaminated for the sake of profit, while at the same time that nation has just about the worst health of any developed nation despite all the wildly expensive “care,” then something is rotten in Denmark – or, rather, in the US. And what is rotten is our food.
Our chemicalized, synthesized, devitalized, devalued and destroyed food is, in fact, what is wrong. Without nutrition, the immune system flags and falters. Without nutrition, the brain does not function well. Without nutrition, the reproductive system grinds to a halt.
Without nutrition, the eyes grow dim. Obvious but true: synthetic food does not provide nutritional sufficiency. Food that is transported half a world away loses its nutritional value.
People who eat food made from GMOs ingest, incorporate and keep within them the seeds of their own destruction and that of any child they might bear.
Science is clear. But profit is, apparently, clearer.
Cheap food is not good food. Cheap food is expensive social degradation and expensive disease. Very, very expensive disease.
And that is, perhaps, a good point to remember: back in 1952 the head of Germany’s Bayer Pharmaceutical, Fritz ter Meer, brought a letter to the UN signed by five pharmaceutical executives who had, like ter Meer, all gone to prison at the end of the Second World War for crimes against humanity and who were now, once again, working for pharmaceutical firms.
Chief executives (and, in ter Meer’s case, the head) of the great civilian German war machine “IG Farben,” these pharmaceutical executives knew well that to accomplish the dream of world domination and cleansing which the Third Reich’s fall left unfinished, they would need to control – and kill – much of the world’s population.
What better way than food? So they urged the UN, in their letter, to take control of the world’s food. He who controls the world’s food, after all, controls the world. And pharmaceutical executives, whose legal responsibility is to their shareholders, have no interest at all in healthy food. Healthy food makes healthy people, and healthy people are poor customers for the diseases which fuel the astronomical profits of the pharmaceutical industry – the preventable, non-communicable diseases of undernutrition, as the World Health Organization calls them. These diseases kill an increasing portion of the world’s people as the world converts to Codex-compliant, USDA- and FDA-approved “food” which weakens and sickens us individually and in our body politic.
It is the drug lord’s gambit, now writ large through the participation of the biotech industry, the factory farming industry, the pesticide industry, the veterinary drug industry (Big Pharma again, because more drugs are used annually for animals than for people), the irradiation industry and the chemical industry. Codex is part of the picture. Codex was born from that impulse.
Visit Nutricide, http://video.google.com/videoplay?docid=-5266884912495233634 to learn more about the origins and impact of Codex Alimentarius (the World Food Code) on your health and the world’s.
Please read below this posting for more information on how to take back the world’s food production, put it back in the capable hands of farmers and reverse the devastating nutrition-based illness trends which will be responsible for 75% of the deaths of the world’s people by 2025, according to the joint publication of the World Health Organization and the Food and Agriculture Organization, The Role of Diet and Exercise in the Prevention of Chronic Disease.
Visit www.NaturalSolutionsFoundation.org to learn about the Natural Solutions Foundation’s International Decade of Nutrition and its Valley of the Moon(TM) Eco Demonstration Community in Panama’s Chiriqui Highlands.
The WHO/FAO joint report on the impact of the PREVENTABLE, non-communicable chronic degenerative diseases of undernutrition states: “It has been projected that, by 2020, chronic diseases will account for almost three-quarters of all deaths worldwide, and that 71% of deaths due to ischaemic heart disease (IHD), 75% of deaths due to stroke, and 70% of deaths due to diabetes will occur in developing countries (4). The number of people in the developing world with diabetes will increase by more than 2.5-fold, from 84 million in 1995 to 228 million in 2025 (5). On a global basis, 60% of the burden of chronic diseases will occur in developing countries.” – reaching the proportions already attained in the developed world for these diseases of undernutrition.
The Natural Solutions Foundation strongly supports taking back the production of food from the multinational corporations who are, literally, killing us and putting it back into the hands and lands of people who know, and love, the food they grow and are part of the communities they serve. That’s what the International Decade of Nutrition is all about, and that is the reason that the Valley of the Moon(TM) Eco Community will house not only a BeyondOrganic(TM) Bio Dynamic Zero Emissions Farm, but a farm school as well.
Please give generously to the Natural Solutions Foundation’s health freedom and International Decade of Nutrition activities. Click here (http://drrimatruthreports.com/index.php?page_id=189) to make your tax-deductible recurring donation.
And click here (http://drrimatruthreports.com/?page_id=1130) to purchase Valley of the Moon(TM) Chemical Free Coffee, “A little bit of heaven in a cup”(c). Every bag gives you a half pound of the world’s best chemical-free coffee and a tax deduction, too!
Thanks for your support.
Yours in health and freedom,
Dr. Rima
Rima E. Laibow, MD
Medical Director
Natural Solutions Foundation
www.HealthFreedomUSA.org
www.GlobalHealthFreedom.org
www.NaturalSolutionsFoundation.org
www.Organics4U.org
www.NaturalSolutionsMarketPlace.org
www.NaturalSolutionsMedia.tv
Farmer in Chief
Michael Pollan, The New York Times
Thursday 09 October 2008
(Copyright – New York Times)
[Reproduced for Educational purposes.]
Dear Mr. President-Elect,
It may surprise you to learn that among the issues that will occupy much of your time in the coming years is one you barely mentioned during the campaign: food. Food policy is not something American presidents have had to give much thought to, at least since the Nixon administration – the last time high food prices presented a serious political peril. Since then, federal policies to promote maximum production of the commodity crops (corn, soybeans, wheat and rice) from which most of our supermarket foods are derived have succeeded impressively in keeping prices low and food more or less off the national political agenda. But with a suddenness that has taken us all by surprise, the era of cheap and abundant food appears to be drawing to a close. What this means is that you, like so many other leaders through history, will find yourself confronting the fact – so easy to overlook these past few years – that the health of a nation’s food system is a critical issue of national security. Food is about to demand your attention.
Complicating matters is the fact that the price and abundance of food are not the only problems we face; if they were, you could simply follow Nixon’s example, appoint a latter-day Earl Butz as your secretary of agriculture and instruct him or her to do whatever it takes to boost production. But there are reasons to think that the old approach won’t work this time around; for one thing, it depends on cheap energy that we can no longer count on. For another, expanding production of industrial agriculture today would require you to sacrifice important values on which you did campaign. Which brings me to the deeper reason you will need not simply to address food prices but to make the reform of the entire food system one of the highest priorities of your administration: unless you do, you will not be able to make significant progress on the health care crisis, energy independence or climate change. Unlike food, these are issues you did campaign on – but as you try to address them you will quickly discover that the way we currently grow, process and eat food in America goes to the heart of all three problems and will have to change if we hope to solve them. Let me explain.
After cars, the food system uses more fossil fuel than any other sector of the economy – 19 percent. And while the experts disagree about the exact amount, the way we feed ourselves contributes more greenhouse gases to the atmosphere than anything else we do – as much as 37 percent, according to one study. Whenever farmers clear land for crops and till the soil, large quantities of carbon are released into the air. But the 20th-century industrialization of agriculture has increased the amount of greenhouse gases emitted by the food system by an order of magnitude; chemical fertilizers (made from natural gas), pesticides (made from petroleum), farm machinery, modern food processing and packaging and transportation have together transformed a system that in 1940 produced 2.3 calories of food energy for every calorie of fossil-fuel energy it used into one that now takes 10 calories of fossil-fuel energy to produce a single calorie of modern supermarket food. Put another way, when we eat from the industrial-food system, we are eating oil and spewing greenhouse gases. This state of affairs appears all the more absurd when you recall that every calorie we eat is ultimately the product of photosynthesis – a process based on making food energy from sunshine. There is hope and possibility in that simple fact.
In addition to the problems of climate change and America’s oil addiction, you have spoken at length on the campaign trail of the health care crisis. Spending on health care has risen from 5 percent of national income in 1960 to 16 percent today, putting a significant drag on the economy. The goal of ensuring the health of all Americans depends on getting those costs under control. There are several reasons health care has gotten so expensive, but one of the biggest, and perhaps most tractable, is the cost to the system of preventable chronic diseases. Four of the top 10 killers in America today are chronic diseases linked to diet: heart disease, stroke, Type 2 diabetes and cancer. It is no coincidence that in the years national spending on health care went from 5 percent to 16 percent of national income, spending on food has fallen by a comparable amount – from 18 percent of household income to less than 10 percent. While the surfeit of cheap calories that the U.S. food system has produced since the late 1970s may have taken food prices off the political agenda, this has come at a steep cost to public health. You cannot expect to reform the health care system, much less expand coverage, without confronting the public-health catastrophe that is the modern American diet.
The impact of the American food system on the rest of the world will have implications for your foreign and trade policies as well. In the past several months more than 30 nations have experienced food riots, and so far one government has fallen. Should high grain prices persist and shortages develop, you can expect to see the pendulum shift decisively away from free trade, at least in food. Nations that opened their markets to the global flood of cheap grain (under pressure from previous administrations as well as the World Bank and the I.M.F.) lost so many farmers that they now find their ability to feed their own populations hinges on decisions made in Washington (like your predecessor’s precipitous embrace of biofuels) and on Wall Street. They will now rush to rebuild their own agricultural sectors and then seek to protect them by erecting trade barriers. Expect to hear the phrases “food sovereignty” and “food security” on the lips of every foreign leader you meet. Not only the Doha round, but the whole cause of free trade in agriculture is probably dead, the casualty of a cheap food policy that a scant two years ago seemed like a boon for everyone. It is one of the larger paradoxes of our time that the very same food policies that have contributed to overnutrition in the first world are now contributing to undernutrition in the third. But it turns out that too much food can be nearly as big a problem as too little – a lesson we should keep in mind as we set about designing a new approach to food policy.
Rich or poor, countries struggling with soaring food prices are being forcibly reminded that food is a national-security issue. When a nation loses the ability to substantially feed itself, it is not only at the mercy of global commodity markets but of other governments as well. At issue is not only the availability of food, which may be held hostage by a hostile state, but its safety: as recent scandals in China demonstrate, we have little control over the safety of imported foods. The deliberate contamination of our food presents another national-security threat. At his valedictory press conference in 2004, Tommy Thompson, the secretary of health and human services, offered a chilling warning, saying, “I, for the life of me, cannot understand why the terrorists have not attacked our food supply, because it is so easy to do.”
This, in brief, is the bad news: the food and agriculture policies you’ve inherited – designed to maximize production at all costs and relying on cheap energy to do so – are in shambles, and the need to address the problems they have caused is acute. The good news is that the twinned crises in food and energy are creating a political environment in which real reform of the food system may actually be possible for the first time in a generation. The American people are paying more attention to food today than they have in decades, worrying not only about its price but about its safety, its provenance and its healthfulness. There is a gathering sense among the public that the industrial-food system is broken. Markets for alternative kinds of food – organic, local, pasture-based, humane – are thriving as never before. All this suggests that a political constituency for change is building and not only on the left: lately, conservative voices have also been raised in support of reform. Writing of the movement back to local food economies, traditional foods (and family meals) and more sustainable farming, The American Conservative magazine editorialized last summer that “this is a conservative cause if ever there was one.”
There are many moving parts to the new food agenda I’m urging you to adopt, but the core idea could not be simpler: we need to wean the American food system off its heavy 20th-century diet of fossil fuel and put it back on a diet of contemporary sunshine. True, this is easier said than done – fossil fuel is deeply implicated in everything about the way we currently grow food and feed ourselves. To put the food system back on sunlight will require policies to change how things work at every link in the food chain: in the farm field, in the way food is processed and sold and even in the American kitchen and at the American dinner table. Yet the sun still shines down on our land every day, and photosynthesis can still work its wonders wherever it does. If any part of the modern economy can be freed from its dependence on oil and successfully resolarized, surely it is food.
How We Got Here
Before setting out an agenda for reforming the food system, it’s important to understand how that system came to be – and also to appreciate what, for all its many problems, it has accomplished. What our food system does well is precisely what it was designed to do, which is to produce cheap calories in great abundance. It is no small thing for an American to be able to go into a fast-food restaurant and to buy a double cheeseburger, fries and a large Coke for a price equal to less than an hour of labor at the minimum wage – indeed, in the long sweep of history, this represents a remarkable achievement.
It must be recognized that the current food system – characterized by monocultures of corn and soy in the field and cheap calories of fat, sugar and feedlot meat on the table – is not simply the product of the free market. Rather, it is the product of a specific set of government policies that sponsored a shift from solar (and human) energy on the farm to fossil-fuel energy.
Did you notice when you flew over Iowa during the campaign how the land was completely bare – black – from October to April? What you were seeing is the agricultural landscape created by cheap oil. In years past, except in the dead of winter, you would have seen in those fields a checkerboard of different greens: pastures and hayfields for animals, cover crops, perhaps a block of fruit trees. Before the application of oil and natural gas to agriculture, farmers relied on crop diversity (and photosynthesis) both to replenish their soil and to combat pests, as well as to feed themselves and their neighbors. Cheap energy, however, enabled the creation of monocultures, and monocultures in turn vastly increased the productivity both of the American land and the American farmer; today the typical corn-belt farmer is single-handedly feeding 140 people.
This did not occur by happenstance. After World War II, the government encouraged the conversion of the munitions industry to fertilizer – ammonium nitrate being the main ingredient of both bombs and chemical fertilizer – and the conversion of nerve-gas research to pesticides. The government also began subsidizing commodity crops, paying farmers by the bushel for all the corn, soybeans, wheat and rice they could produce. One secretary of agriculture after another implored them to plant “fence row to fence row” and to “get big or get out.”
The chief result, especially after the Earl Butz years, was a flood of cheap grain that could be sold for substantially less than it cost farmers to grow because a government check helped make up the difference. As this artificially cheap grain worked its way up the food chain, it drove down the price of all the calories derived from that grain: the high-fructose corn syrup in the Coke, the soy oil in which the potatoes were fried, the meat and cheese in the burger.
Subsidized monocultures of grain also led directly to monocultures of animals: since factory farms could buy grain for less than it cost farmers to grow it, they could now fatten animals more cheaply than farmers could. So America’s meat and dairy animals migrated from farm to feedlot, driving down the price of animal protein to the point where an American can enjoy eating, on average, 190 pounds of meat a year – a half pound every day.
But if taking the animals off farms made a certain kind of economic sense, it made no ecological sense whatever: their waste, formerly regarded as a precious source of fertility on the farm, became a pollutant – factory farms are now one of America’s biggest sources of pollution. As Wendell Berry has tartly observed, to take animals off farms and put them on feedlots is to take an elegant solution – animals replenishing the fertility that crops deplete – and neatly divide it into two problems: a fertility problem on the farm and a pollution problem on the feedlot. The former problem is remedied with fossil-fuel fertilizer; the latter is remedied not at all.
What was once a regional food economy is now national and increasingly global in scope – thanks again to fossil fuel. Cheap energy – for trucking food as well as pumping water – is the reason New York City now gets its produce from California rather than from the “Garden State” next door, as it did before the advent of Interstate highways and national trucking networks. More recently, cheap energy has underwritten a globalized food economy in which it makes (or rather, made) economic sense to catch salmon in Alaska, ship it to China to be filleted and then ship the fillets back to California to be eaten; or one in which California and Mexico can profitably swap tomatoes back and forth across the border; or Denmark and the United States can trade sugar cookies across the Atlantic. About that particular swap the economist Herman Daly once quipped, “Exchanging recipes would surely be more efficient.”
Whatever we may have liked about the era of cheap, oil-based food, it is drawing to a close. Even if we were willing to continue paying the environmental or public-health price, we’re not going to have the cheap energy (or the water) needed to keep the system going, much less expand production. But as is so often the case, a crisis provides opportunity for reform, and the current food crisis presents opportunities that must be seized.
In drafting these proposals, I’ve adhered to a few simple principles of what a 21st-century food system needs to do. First, your administration’s food policy must strive to provide a healthful diet for all our people; this means focusing on the quality and diversity (and not merely the quantity) of the calories that American agriculture produces and American eaters consume. Second, your policies should aim to improve the resilience, safety and security of our food supply. Among other things, this means promoting regional food economies both in America and around the world. And lastly, your policies need to reconceive agriculture as part of the solution to environmental problems like climate change.
These goals are admittedly ambitious, yet they will not be difficult to align or advance as long as we keep in mind this One Big Idea: most of the problems our food system faces today stem from its reliance on fossil fuels, and to the extent that our policies wring the oil out of the system and replace it with the energy of the sun, those policies will simultaneously improve the state of our health, our environment and our security.
I. Resolarizing the American Farm
What happens in the field influences every other link of the food chain on up to our meals – if we grow monocultures of corn and soy, we will find the products of processed corn and soy on our plates. Fortunately for your initiative, the federal government has enormous leverage in determining exactly what happens on the 830 million acres of American crop and pasture land.
Today most government farm and food programs are designed to prop up the old system of maximizing production from a handful of subsidized commodity crops grown in monocultures. Even food-assistance programs like WIC and school lunch focus on maximizing quantity rather than quality, typically specifying a minimum number of calories (rather than maximums) and seldom paying more than lip service to nutritional quality. This focus on quantity may have made sense in a time of food scarcity, but today it gives us a school-lunch program that feeds chicken nuggets and Tater Tots to overweight and diabetic children.
Your challenge is to take control of this vast federal machinery and use it to drive a transition to a new solar-food economy, starting on the farm. Right now, the government actively discourages the farmers it subsidizes from growing healthful, fresh food: farmers receiving crop subsidies are prohibited from growing “specialty crops” – farm-bill speak for fruits and vegetables. (This rule was the price exacted by California and Florida produce growers in exchange for going along with subsidies for commodity crops.) Commodity farmers should instead be encouraged to grow as many different crops – including animals – as possible. Why? Because the greater the diversity of crops on a farm, the less the need for both fertilizers and pesticides.
The power of cleverly designed polycultures to produce large amounts of food from little more than soil, water and sunlight has been proved, not only by small-scale “alternative” farmers in the United States but also by large rice-and-fish farmers in China and giant-scale operations (up to 15,000 acres) in places like Argentina. There, in a geography roughly comparable to that of the American farm belt, farmers have traditionally employed an ingenious eight-year rotation of perennial pasture and annual crops: after five years grazing cattle on pasture (and producing the world’s best beef), farmers can then grow three years of grain without applying any fossil-fuel fertilizer. Or, for that matter, many pesticides: the weeds that afflict pasture can’t survive the years of tillage, and the weeds of row crops don’t survive the years of grazing, making herbicides all but unnecessary. There is no reason – save current policy and custom – that American farmers couldn’t grow both high-quality grain and grass-fed beef under such a regime through much of the Midwest. (It should be noted that today’s sky-high grain prices are causing many Argentine farmers to abandon their rotation to grow grain and soybeans exclusively, an environmental disaster in the making.)
Federal policies could do much to encourage this sort of diversified sun farming. Begin with the subsidies: payment levels should reflect the number of different crops farmers grow or the number of days of the year their fields are green – that is, taking advantage of photosynthesis, whether to grow food, replenish the soil or control erosion. If Midwestern farmers simply planted a cover crop after the fall harvest, they would significantly reduce their need for fertilizer, while cutting down on soil erosion. Why don’t farmers do this routinely? Because in recent years fossil-fuel-based fertility has been so much cheaper and easier to use than sun-based fertility.
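To make this concrete: a payment keyed to photosynthesis might be computed from a farm's acreage, the number of days its fields are green and the number of distinct crops grown. The sketch below (in Python) is purely illustrative; the per-acre rate and the diversity bonus are invented placeholders, not figures from any actual or proposed program.

# Illustrative sketch of a "green payment" formula: the subsidy scales with
# the number of days a field is photosynthesizing and the number of distinct
# crops grown. The rates below are invented placeholders, not figures from
# this memo or from any real farm-bill program.

def green_payment(acres: float, days_green: int, distinct_crops: int,
                  rate_per_green_acre_day: float = 0.05,
                  diversity_bonus_per_crop: float = 2.00) -> float:
    """Return a hypothetical annual payment (in dollars) for one farm."""
    days_green = min(days_green, 365)
    base = acres * days_green * rate_per_green_acre_day
    bonus = acres * distinct_crops * diversity_bonus_per_crop
    return round(base + bonus, 2)

# A corn-soy farm bare from October to April vs. the same farm with a winter
# cover crop and a pasture rotation (all numbers hypothetical):
print(green_payment(acres=1000, days_green=150, distinct_crops=2))
print(green_payment(acres=1000, days_green=300, distinct_crops=5))

Under such a formula, the farmer who keeps fields green longer and grows more kinds of crops collects more, without anyone dictating which crops to plant.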
In addition to rewarding farmers for planting cover crops, we should make it easier for them to apply compost to their fields – a practice that improves not only the fertility of the soil but also its ability to hold water and therefore withstand drought. (There is mounting evidence that it also boosts the nutritional quality of the food grown in it.) The U.S.D.A. estimates that Americans throw out 14 percent of the food they buy; much more is wasted by retailers, wholesalers and institutions. A program to make municipal composting of food and yard waste mandatory and then distribute the compost free to area farmers would shrink America’s garbage heap, cut the need for irrigation and fossil-fuel fertilizers in agriculture and improve the nutritional quality of the American diet.
Right now, most of the conservation programs run by the U.S.D.A. are designed on the zero-sum principle: land is either locked up in “conservation” or it is farmed intensively. This either-or approach reflects an outdated belief that modern farming and ranching are inherently destructive, so that the best thing for the environment is to leave land untouched. But we now know how to grow crops and graze animals in systems that will support biodiversity, soil health, clean water and carbon sequestration. The Conservation Stewardship Program, championed by Senator Tom Harkin and included in the 2008 Farm Bill, takes an important step toward rewarding these kinds of practices, but we need to move this approach from the periphery of our farm policy to the very center. Longer term, the government should back ambitious research now under way (at the Land Institute in Kansas and a handful of other places) to “perennialize” commodity agriculture: to breed varieties of wheat, rice and other staple grains that can be grown like prairie grasses – without having to till the soil every year. These perennial grains hold the promise of slashing the fossil fuel now needed to fertilize and till the soil, while protecting farmland from erosion and sequestering significant amounts of carbon.
But that is probably a 50-year project. For today’s agriculture to wean itself from fossil fuel and make optimal use of sunlight, crop plants and animals must once again be married on the farm – as in Wendell Berry’s elegant “solution.” Sunlight nourishes the grasses and grains, the plants nourish the animals, the animals then nourish the soil, which in turn nourishes the next season’s grasses and grains. Animals on pasture can also harvest their own feed and dispose of their own waste – all without our help or fossil fuel.
If this system is so sensible, you might ask, why did it succumb to Confined Animal Feeding Operations, or CAFOs? In fact there is nothing inherently efficient or economical about raising vast cities of animals in confinement. Three struts, each put into place by federal policy, support the modern CAFO, and the most important of these – the ability to buy grain for less than it costs to grow it – has just been kicked away. The second strut is F.D.A. approval for the routine use of antibiotics in feed, without which the animals in these places could not survive their crowded, filthy and miserable existence. And the third is that the government does not require CAFOs to treat their wastes as it would require human cities of comparable size to do. The F.D.A. should ban the routine use of antibiotics in livestock feed on public-health grounds, now that we have evidence that the practice is leading to the evolution of drug-resistant bacterial diseases and to outbreaks of E. coli and salmonella poisoning. CAFOs should also be regulated like the factories they are, required to clean up their waste like any other industry or municipality.
It will be argued that moving animals off feedlots and back onto farms will raise the price of meat. It probably will – as it should. You will need to make the case that paying the real cost of meat, and therefore eating less of it, is a good thing for our health, for the environment, for our dwindling reserves of fresh water and for the welfare of the animals. Meat and milk production represent the food industry’s greatest burden on the environment; a recent U.N. study estimated that the world’s livestock alone account for 18 percent of all greenhouse gases, more than all forms of transportation combined. (According to one study, a pound of feedlot beef also takes 5,000 gallons of water to produce.) And while animals living on farms will still emit their share of greenhouse gases, grazing them on grass and returning their waste to the soil will substantially offset their carbon hoof prints, as will getting ruminant animals off grain. A bushel of grain takes approximately a half gallon of oil to produce; grass can be grown with little more than sunshine.
It will be argued that sun-food agriculture will generally yield less food than fossil-fuel agriculture. This is debatable. The key question you must be prepared to answer is simply this: Can the sort of sustainable agriculture you’re proposing feed the world?
There are a couple of ways to answer this question. The simplest and most honest answer is that we don’t know, because we haven’t tried. But in the same way we now need to learn how to run an industrial economy without cheap fossil fuel, we have no choice but to find out whether sustainable agriculture can produce enough food. The fact is, during the past century, our agricultural research has been directed toward the goal of maximizing production with the help of fossil fuel. There is no reason to think that bringing the same sort of resources to the development of more complex, sun-based agricultural systems wouldn’t produce comparable yields. Today’s organic farmers, operating for the most part without benefit of public investment in research, routinely achieve 80 to 100 percent of conventional yields in grain and, in drought years, frequently exceed conventional yields. (This is because organic soils better retain moisture.) Assuming no further improvement, could the world – with a population expected to peak at 10 billion – survive on these yields?
First, bear in mind that the average yield of world agriculture today is substantially lower than that of modern sustainable farming. According to a recent University of Michigan study, merely bringing international yields up to today’s organic levels could increase the world’s food supply by 50 percent.
The second point to bear in mind is that yield isn’t everything – and growing high-yield commodities is not quite the same thing as growing food. Much of what we’re growing today is not directly eaten as food but processed into low-quality calories of fat and sugar. As the world epidemic of diet-related chronic disease has demonstrated, the sheer quantity of calories that a food system produces improves health only up to a point, but after that, quality and diversity are probably more important. We can expect that a food system that produces somewhat less food but of a higher quality will produce healthier populations.
The final point to consider is that 40 percent of the world’s grain output today is fed to animals; 11 percent of the world’s corn and soybean crop is fed to cars and trucks, in the form of biofuels. Provided the developed world can cut its consumption of grain-based animal protein and ethanol, there should be plenty of food for everyone – however we choose to grow it.
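A rough back-of-envelope calculation shows how much that margin matters. The 40 percent and 11 percent shares come from the paragraph above; the world-output figure and the assumed cuts in feed and fuel use in the sketch below are hypothetical, chosen only to illustrate the arithmetic.

# Back-of-envelope sketch of the grain freed up if feed and fuel uses were
# scaled back. The 40% (animal feed) and 11% (biofuel) shares come from the
# text above; the total-output figure and the assumed cuts are placeholders.

world_grain_output_mt = 2200.0   # million tonnes; hypothetical round figure
share_fed_to_animals = 0.40      # from the text
share_fed_to_biofuel = 0.11      # from the text

cut_in_feed_use = 0.25           # hypothetical: less grain-fed meat in the developed world
cut_in_biofuel_use = 0.50        # hypothetical: ethanol mandates scaled back

freed_for_food_mt = world_grain_output_mt * (
    share_fed_to_animals * cut_in_feed_use +
    share_fed_to_biofuel * cut_in_biofuel_use
)
print(f"Grain redirected to food: ~{freed_for_food_mt:.0f} million tonnes "
      f"({freed_for_food_mt / world_grain_output_mt:.0%} of output)")

Even under these modest assumed cuts, more than a tenth of the world's grain output would be redirected from feedlots and gas tanks to dinner plates.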
In fact, well-designed polyculture systems, incorporating not just grains but vegetables and animals, can produce more food per acre than conventional monocultures, and food of a much higher nutritional value. But this kind of farming is complicated and needs many more hands on the land to make it work. Farming without fossil fuels – performing complex rotations of plants and animals and managing pests without petrochemicals – is labor intensive and takes more skill than merely “driving and spraying,” which is how corn-belt farmers describe what they do for a living.
To grow sufficient amounts of food using sunlight will require more people growing food – millions more. This suggests that sustainable agriculture will be easier to implement in the developing world, where large rural populations remain, than in the West, where they don’t. But what about here in America, where we have only about two million farmers left to feed a population of 300 million? And where farmland is being lost to development at the rate of 2,880 acres a day? Post-oil agriculture will need a lot more people engaged in food production – as farmers and probably also as gardeners.
The sun-food agenda must include programs to train a new generation of farmers and then help put them on the land. The average American farmer today is 55 years old; we shouldn’t expect these farmers to embrace the sort of complex ecological approach to agriculture that is called for. Our focus should be on teaching ecological farming systems to students entering land-grant colleges today. For decades now, it has been federal policy to shrink the number of farmers in America by promoting capital-intensive monoculture and consolidation. As a society, we devalued farming as an occupation and encouraged the best students to leave the farm for “better” jobs in the city. We emptied America’s rural counties in order to supply workers to urban factories. To put it bluntly, we now need to reverse course. We need more highly skilled small farmers in more places all across America – not as a matter of nostalgia for the agrarian past but as a matter of national security. For nations that lose the ability to substantially feed themselves will find themselves as gravely compromised in their international dealings as nations that depend on foreign sources of oil presently do. But while there are alternatives to oil, there are no alternatives to food.
National security also argues for preserving every acre of farmland we can and then making it available to new farmers. We simply will not be able to depend on distant sources of food, and therefore need to preserve every acre of good farmland within a day’s drive of our cities. In the same way that when we came to recognize the supreme ecological value of wetlands we erected high bars to their development, we need to recognize the value of farmland to our national security and require real-estate developers to do “food-system impact statements” before development begins. We should also create tax and zoning incentives for developers to incorporate farmland (as they now do “open space”) in their subdivision plans; all those subdivisions now ringing golf courses could someday have diversified farms at their center.
The revival of farming in America, which of course draws on the abiding cultural power of our agrarian heritage, will pay many political and economic dividends. It will lead to robust economic renewal in the countryside. And it will generate tens of millions of new “green jobs,” which is precisely how we need to begin thinking of skilled solar farming: as a vital sector of the 21st-century post-fossil-fuel economy.
II. Reregionalizing the Food System
For your sun-food agenda to succeed, it will have to do a lot more than alter what happens on the farm. The government could help seed a thousand new polyculture farmers in every county in Iowa, but they would promptly fail if the grain elevator remained the only buyer in town and corn and beans were the only crops it would take. Resolarizing the food system means building the infrastructure for a regional food economy – one that can support diversified farming and, by shortening the food chain, reduce the amount of fossil fuel in the American diet.
A decentralized food system offers a great many other benefits as well. Food eaten closer to where it is grown will be fresher and require less processing, making it more nutritious. Whatever may be lost in efficiency by localizing food production is gained in resilience: regional food systems can better withstand all kinds of shocks. When a single factory is grinding 20 million hamburger patties in a week or washing 25 million servings of salad, a single terrorist armed with a canister of toxins can, at a stroke, poison millions. Such a system is equally susceptible to accidental contamination: the bigger and more global the trade in food, the more vulnerable the system is to catastrophe. The best way to protect our food system against such threats is obvious: decentralize it.
Today in America there is soaring demand for local and regional food; farmers’ markets, of which the U.S.D.A. estimates there are now 4,700, have become one of the fastest-growing segments of the food market. Community-supported agriculture is booming as well: there are now nearly 1,500 community-supported farms, to which consumers pay an annual fee in exchange for a weekly box of produce through the season. The local-food movement will continue to grow with no help from the government, especially as high fuel prices make distant and out-of-season food, as well as feedlot meat, more expensive. Yet there are several steps the government can take to nurture this market and make local foods more affordable. Here are a few:
Four-Season Farmers’ Markets. Provide grants to towns and cities to build year-round indoor farmers’ markets, on the model of Pike Place in Seattle or the Reading Terminal Market in Philadelphia. To supply these markets, the U.S.D.A. should make grants to rebuild local distribution networks in order to minimize the amount of energy used to move produce within local food sheds.
Agricultural Enterprise Zones. Today the revival of local food economies is being hobbled by a tangle of regulations originally designed to check abuses by the very largest food producers. Farmers should be able to smoke a ham and sell it to their neighbors without making a huge investment in federally approved facilities. Food-safety regulations must be made sensitive to scale and marketplace, so that a small producer selling direct off the farm or at a farmers’ market is not regulated as onerously as a multinational food manufacturer. This is not because local food won’t ever have food-safety problems – it will – only that its problems will be less catastrophic and easier to manage because local food is inherently more traceable and accountable.
Local Meat-Inspection Corps. Perhaps the single greatest impediment to the return of livestock to the land and the revival of local, grass-based meat production is the disappearance of regional slaughter facilities. The big meat processors have been buying up local abattoirs only to close them down as they consolidate, and the U.S.D.A. does little to support the ones that remain. From the department’s perspective, it is a better use of shrinking resources to dispatch its inspectors to a plant slaughtering 400 head an hour than to a regional abattoir slaughtering a dozen. The U.S.D.A. should establish a Local Meat-Inspectors Corps to serve these processors. Expanding on its successful pilot program on Lopez Island in Puget Sound, the U.S.D.A. should also introduce a fleet of mobile abattoirs that would go from farm to farm, processing animals humanely and inexpensively. Nothing would do more to make regional, grass-fed meat fully competitive in the market with feedlot meat.
Establish a Strategic Grain Reserve. In the same way the shift to alternative energy depends on keeping oil prices relatively stable, the sun-food agenda – as well as the food security of billions of people around the world – will benefit from government action to prevent huge swings in commodity prices. A strategic grain reserve, modeled on the Strategic Petroleum Reserve, would help achieve this objective and at the same time provide some cushion for world food stocks, which today stand at perilously low levels. Governments should buy and store grain when it is cheap and sell when it is dear, thereby moderating price swings in both directions and discouraging speculation.
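The buy-cheap, sell-dear rule can be pictured as a simple price band around which the reserve operates. In the sketch below, the band limits, the reserve's capacity and the price path are all invented numbers; the point is only to show the mechanism, not to propose parameters.

# Minimal sketch of a price-band rule for a strategic grain reserve: buy into
# the reserve when the market price falls below a floor, sell out of it when
# the price rises above a ceiling. All numbers are hypothetical.

FLOOR, CEILING = 4.00, 7.00      # dollars per bushel; illustrative band
CAPACITY = 500.0                 # million bushels the reserve can hold
TRANCHE = 50.0                   # million bushels bought or sold per period

def step(price: float, stock: float):
    """Return the action taken and the reserve's new stock for one period."""
    if price < FLOOR and stock < CAPACITY:
        return "buy", min(stock + TRANCHE, CAPACITY)
    if price > CEILING and stock > 0:
        return "sell", max(stock - TRANCHE, 0.0)
    return "hold", stock

stock = 200.0
for price in [3.50, 3.80, 5.20, 7.40, 8.10, 6.50]:   # hypothetical price path
    action, stock = step(price, stock)
    print(f"price ${price:.2f}/bu -> {action:4s}, reserve {stock:.0f}M bu")

The reserve accumulates grain when prices sag and releases it when they spike, damping the swings that speculation feeds on.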
Regionalize Federal Food Procurement. In the same way that federal procurement is often used to advance important social goals (like promoting minority-owned businesses), we should require that some minimum percentage of government food purchases – whether for school-lunch programs, military bases or federal prisons – go to producers located within 100 miles of institutions buying the food. We should create incentives for hospitals and universities receiving federal funds to buy fresh local produce. To channel even a small portion of institutional food purchasing to local food would vastly expand regional agriculture and improve the diet of the millions of people these institutions feed.
Create a Federal Definition of “Food.” It makes no sense for government food-assistance dollars, intended to improve the nutritional health of at-risk Americans, to support the consumption of products we know to be unhealthful. Yes, some people will object that for the government to specify what food stamps can and cannot buy smacks of paternalism. Yet we already prohibit the purchase of tobacco and alcohol with food stamps. So why not prohibit something like soda, which is arguably less nutritious than red wine? Because it is, nominally, a food, albeit a “junk food.” We need to stop flattering nutritionally worthless foodlike substances by calling them “junk food” – and instead make clear that such products are not in fact food of any kind. Defining what constitutes real food worthy of federal support will no doubt be controversial (you’ll recall President Reagan’s ketchup imbroglio), but defining food upward may be more politically palatable than defining it down, as Reagan sought to do. One approach would be to rule that, in order to be regarded as a food by the government, an edible substance must contain a certain minimum ratio of micronutrients per calorie of energy. At a stroke, such a definition would improve the quality of school lunch and discourage sales of unhealthful products, since typically only “food” is exempt from local sales tax.
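Expressed as a rule, the test is simple: add up a product's micronutrients as fractions of reference daily values and divide by its calories. The sketch below makes the rule concrete; the nutrient list, the reference values and the qualifying threshold are placeholders, not a standard this memo is proposing.

# Sketch of the "minimum micronutrients per calorie" test. The nutrients
# tracked, the reference daily values and the qualifying threshold are all
# hypothetical placeholders chosen only to make the rule concrete.

REFERENCE_DAILY_VALUES = {       # illustrative reference amounts
    "fiber_g": 28.0,
    "vitamin_c_mg": 90.0,
    "calcium_mg": 1300.0,
    "iron_mg": 18.0,
}
MIN_DENSITY = 0.001              # hypothetical: nutrient points per calorie

def counts_as_food(calories: float, nutrients: dict) -> bool:
    """True if nutrient density per calorie clears the hypothetical threshold."""
    points = sum(nutrients.get(name, 0.0) / dv
                 for name, dv in REFERENCE_DAILY_VALUES.items())
    return calories > 0 and (points / calories) >= MIN_DENSITY

# A 12-oz soda (calories, essentially no micronutrients) vs. a medium apple:
print(counts_as_food(140, {}))                                    # -> False
print(counts_as_food(95, {"fiber_g": 4.4, "vitamin_c_mg": 8.4}))  # -> True

Under any such rule the soda fails and the apple passes, which is the whole point: federal food dollars would follow nutrient density rather than mere calories.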
A few other ideas: Food-stamp debit cards should double in value whenever swiped at farmers’ markets – all of which, by the way, need to be equipped with the Electronic Benefit Transfer card readers that supermarkets already have. We should expand the WIC program that gives farmers’-market vouchers to low-income women with children; such programs help attract farmers’ markets to urban neighborhoods where access to fresh produce is often nonexistent. (We should also offer tax incentives to grocery chains willing to build supermarkets in underserved neighborhoods.) Federal food assistance for the elderly should build on a successful program pioneered by the state of Maine that buys low-income seniors a membership in a community-supported farm. All these initiatives have the virtue of advancing two objectives at once: supporting the health of at-risk Americans and the revival of local food economies.
III. Rebuilding America’s Food Culture
In the end, shifting the American diet from a foundation of imported fossil fuel to local sunshine will require changes in our daily lives, which by now are deeply implicated in the economy and culture of fast, cheap and easy food. Making available more healthful and more sustainable food does not guarantee it will be eaten, much less appreciated or enjoyed. We need to use all the tools at our disposal – not just federal policy and public education but the president’s bully pulpit and the example of the first family’s own dinner table – to promote a new culture of food that can undergird your sun-food agenda.
Changing the food culture must begin with our children, and it must begin in the schools. Nearly a half-century ago, President Kennedy announced a national initiative to improve the physical fitness of American children. He did it by elevating the importance of physical education, pressing states to make it a requirement in public schools. We need to bring the same commitment to “edible education” – in Alice Waters’s phrase – by making lunch, in all its dimensions, a mandatory part of the curriculum. On the premise that eating well is a critically important life skill, we need to teach all primary-school students the basics of growing and cooking food and then enjoying it at shared meals.
To change our children’s food culture, we’ll need to plant gardens in every primary school, build fully equipped kitchens, train a new generation of lunchroom ladies (and gentlemen) who can once again cook and teach cooking to children. We should introduce a School Lunch Corps program that forgives federal student loans to culinary-school graduates in exchange for two years of service in the public-school lunch program. And we should immediately increase school-lunch spending per pupil by $1 a day – the minimum amount food-service experts believe it will take to underwrite a shift from fast food in the cafeteria to real food freshly prepared.
But it is not only our children who stand to benefit from public education about food. Today most federal messages about food, from nutrition labeling to the food pyramid, are negotiated with the food industry. The surgeon general should take over from the Department of Agriculture the job of communicating with Americans about their diet. That way we might begin to construct a less equivocal and more effective public-health message about nutrition. Indeed, there is no reason that public-health campaigns about the dangers of obesity and Type 2 diabetes shouldn’t be as tough and as effective as public-health campaigns about the dangers of smoking. The Centers for Disease Control estimates that one in three American children born in 2000 will develop Type 2 diabetes. The public needs to know and see precisely what that sentence means: blindness; amputation; early death. All of which can be avoided by a change in diet and lifestyle. A public-health crisis of this magnitude calls for a blunt public-health message, even at the expense of offending the food industry. Judging by the success of recent antismoking campaigns, the savings to the health care system could be substantial.
There are other kinds of information about food that the government can supply or demand. In general we should push for as much transparency in the food system as possible – the other sense in which “sunlight” should be the watchword of our agenda. The F.D.A. should require that every packaged-food product include a second calorie count, indicating how many calories of fossil fuel went into its production. Oil is one of the most important ingredients in our food, and people ought to know just how much of it they’re eating. The government should also throw its support behind putting a second bar code on all food products that, when scanned either in the store or at home (or with a cellphone), brings up on a screen the whole story and pictures of how that product was produced: in the case of crops, images of the farm and lists of agrochemicals used in its production; in the case of meat and dairy, descriptions of the animals’ diet and drug regimen, as well as live video feeds of the CAFO where they live and, yes, the slaughterhouse where they die. The very length and complexity of the modern food chain breeds a culture of ignorance and indifference among eaters. Shortening the food chain is one way to create more conscious consumers, but deploying technology to pierce the veil is another.
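What might sit behind that second label and bar code? The sketch below outlines one possible data record, pairing food calories with the fossil-fuel calories embedded in production and a pointer to the provenance story a scan would retrieve; every field name and value is invented for illustration, and no such F.D.A. format exists today.

# Sketch of a data record behind the proposed second label: food calories,
# the fossil-fuel calories embedded in production, and a pointer to the
# provenance story a scanned bar code would retrieve. Every field name and
# value here is invented for illustration; no real F.D.A. format exists.

from dataclasses import dataclass, field

@dataclass
class ProvenanceLabel:
    product: str
    food_calories_per_serving: float
    fossil_calories_per_serving: float   # energy used in production and transport
    provenance_url: str                  # what the second bar code resolves to
    agrochemicals_used: list = field(default_factory=list)

    def energy_ratio(self) -> float:
        """Fossil-fuel calories burned per food calorie delivered."""
        return self.fossil_calories_per_serving / self.food_calories_per_serving

label = ProvenanceLabel(
    product="frozen dinner (hypothetical)",
    food_calories_per_serving=450,
    fossil_calories_per_serving=4500,     # illustrative 10:1 ratio
    provenance_url="https://example.com/lot/12345",
    agrochemicals_used=["atrazine", "anhydrous ammonia"],
)
print(f"{label.product}: {label.energy_ratio():.0f} fossil calories per food calorie")

Printing that ratio on the package would let shoppers compare the oil in their food as readily as they now compare its calories.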
Finally, there is the power of the example you set in the White House. If what’s needed is a change of culture in America’s thinking about food, then how America’s first household organizes its eating will set the national tone, focusing the light of public attention on the issue and communicating a simple set of values that can guide Americans toward sun-based foods and away from eating oil.
The choice of White House chef is always closely watched, and you would be wise to appoint a figure who is identified with the food movement and committed to cooking simply from fresh local ingredients. Besides feeding you and your family exceptionally well, such a chef would demonstrate how it is possible even in Washington to eat locally for much of the year, and that good food needn’t be fussy or complicated but does depend on good farming. You should make a point of the fact that every night you’re in town, you join your family for dinner in the Executive Residence – at a table. (Surely you remember the Reagans’ TV trays.) And you should also let it be known that the White House observes one meatless day a week – a step that, if all Americans followed suit, would be the equivalent, in carbon saved, of taking 20 million midsize sedans off the road for a year. Let the White House chef post daily menus on the Web, listing the farmers who supplied the food, as well as recipes.
Since enhancing the prestige of farming as an occupation is critical to developing the sun-based regional agriculture we need, the White House should appoint, in addition to a White House chef, a White House farmer. This new post would be charged with implementing what could turn out to be your most symbolically resonant step in building a new American food culture. And that is this: tear out five prime south-facing acres of the White House lawn and plant in their place an organic fruit and vegetable garden.
When Eleanor Roosevelt did something similar in 1943, she helped start a Victory Garden movement that ended up making a substantial contribution to feeding the nation in wartime. (Less well known is the fact that Roosevelt planted this garden over the objections of the U.S.D.A., which feared home gardening would hurt the American food industry.) By the end of the war, more than 20 million home gardens were supplying 40 percent of the produce consumed in America. The president should throw his support behind a new Victory Garden movement, this one seeking “victory” over three critical challenges we face today: high food prices, poor diets and a sedentary population. Eating from this, the shortest food chain of all, offers anyone with a patch of land a way to reduce their fossil-fuel consumption and help fight climate change. (We should offer grants to cities to build allotment gardens for people without access to land.) Just as important, Victory Gardens offer a way to enlist Americans, in body as well as mind, in the work of feeding themselves and changing the food system – something more ennobling, surely, than merely asking them to shop a little differently.
I don’t need to tell you that ripping out even a section of the White House lawn will be controversial: Americans love their lawns, and the South Lawn is one of the most beautiful in the country. But imagine all the energy, water and petrochemicals it takes to make it that way. (Even for the purposes of this memo, the White House would not disclose its lawn-care regimen.) Yet as deeply as Americans feel about their lawns, the agrarian ideal runs deeper still, and making this particular plot of American land productive, especially if the First Family gets out there and pulls weeds now and again, will provide an image even more stirring than that of a pretty lawn: the image of stewardship of the land, of self-reliance and of making the most of local sunlight to feed one’s family and community. The fact that surplus produce from the South Lawn Victory Garden (and there will be literally tons of it) will be offered to regional food banks will make its own eloquent statement.
You’re probably thinking that growing and eating organic food in the White House carries a certain political risk. It is true you might want to plant iceberg lettuce rather than arugula, at least to start. (Or simply call arugula by its proper American name, as generations of Midwesterners have done: “rocket.”) But it should not be difficult to deflect the charge of elitism sometimes leveled at the sustainable-food movement. Reforming the food system is not inherently a right-or-left issue: for every Whole Foods shopper with roots in the counterculture you can find a family of evangelicals intent on taking control of its family dinner and diet back from the fast-food industry – the culinary equivalent of home schooling. You should support hunting as a particularly sustainable way to eat meat – meat grown without any fossil fuels whatsoever. There is also a strong libertarian component to the sun-food agenda, which seeks to free small producers from the burden of government regulation in order to stoke rural innovation. And what is a higher “family value,” after all, than making time to sit down every night to a shared meal?
Our agenda puts the interests of America’s farmers, families and communities ahead of the fast-food industry’s. For that industry and its apologists to imply that it is somehow more “populist” or egalitarian to hand our food dollars to Burger King or General Mills than to support a struggling local farmer is absurd. Yes, sun food costs more, but the reasons why it does only undercut the charge of elitism: cheap food is only cheap because of government handouts and regulatory indulgence (both of which we will end), not to mention the exploitation of workers, animals and the environment on which its putative “economies” depend. Cheap food is food dishonestly priced – it is in fact unconscionably expensive.
Your sun-food agenda promises to win support across the aisle. It builds on America’s agrarian past, but turns it toward a more sustainable, sophisticated future. It honors the work of American farmers and enlists them in three of the 21st century’s most urgent errands: to move into the post-oil era, to improve the health of the American people and to mitigate climate change. Indeed, it enlists all of us in this great cause by turning food consumers into part-time producers, reconnecting the American people with the American land and demonstrating that we need not choose between the welfare of our families and the health of the environment – that eating less oil and more sunlight will redound to the benefit of both.
——-
Michael Pollan, a contributing writer for the magazine, is the Knight Professor of Journalism at the University of California, Berkeley. He is the author, most recently, of “In Defense of Food: An Eater’s Manifesto.”