The Future: Six Drivers of Global Change

"MODIFY THE KID"

As implants, prosthetics, neuroprosthetics, and other applications in cybernetics continue to improve, the discussion has broadened from their use as therapeutic, remedial, and reparative devices to the implications of using prosthetics that enhance humans. For example, the brain implants described above that can help stroke victims learn more quickly how to walk again can also be used in healthy people to enhance concentration at times of their choosing, whether to help them learn a brand-new skill or to sharpen their focus when they feel it is particularly important.

The temporary enhancement of mental performance through the use of pharmaceuticals has already begun, with an estimated 4 percent of college students routinely using attention-focusing medications like Adderall, Ritalin, and Provigil to improve their scores on exams. Studies at some schools found rates as high as 35 percent. After an in-depth investigation of the use of these drugs in high schools, The New York Times reported that there was "no reliable research" on which to base a national estimate, but that a survey of more than fifteen schools with high academic standards yielded an estimate from doctors and students that the percentage of students using these substances "ranges from 15 percent to 40 percent."

The Times went on to report, "One consensus was clear: users were becoming more common ... and some students who would rather not take the drugs would be compelled to join them because of the competition over class rank and colleges' interest." Some doctors who work with low-income families have started prescribing Adderall for children to help them compensate for the advantages that children from wealthy families have. One of them, Dr. Michael Anderson, of Canton, Georgia, told the Times he thinks of it as "evening the scales a little bit.... We've decided as a society that it's too expensive to modify the kid's environment. So we have to modify the kid."

A few years ago, almost 1,500 people working as research scientists at institutions in more than sixty countries responded to a survey on the use of brain-enhancing pharmaceuticals. Approximately 20 percent said that they had indeed used such drugs, with the majority saying they felt they improved their memory and ability to focus. Although inappropriate use and dangerous overuse of these substances has caused doctors to warn about risks and side effects, scientists are working on new compounds that carry the promise of actually boosting intelligence. Some predict that the use of the improved intelligence-enhancement drugs now under development may well become commonplace and carry as little stigma as cosmetic surgery does today. The U.S. Defense Advanced Research Projects Agency is experimenting with a different approach to enhance concentration and speed the learning of new skills, by using small electrical currents applied from outside the skull to the part of the brain used for object recognition in order to improve the training of snipers.

ENHANCING PERFORMANCE

At the 2012 Olympics, South Africa's Oscar Pistorius made history as the first double amputee track athlete ever to compete. Pistorius, who was born with no fibulas in his lower legs, both of which were amputated before he was one year old, learned to run on prosthetics. He competed in the 400-meter sprint, where he reached the semifinals, and the 4 × 400-meter relay, in which the South African team reached the finals.

Some of Pistorius's competitors expressed concern before the games that the flexible blades attached to his prosthetic lower legs actually gave him an unfair advantage. The retired world record holder in the 400-meter sprint, Michael Johnson, said, "Because we don't know for sure whether he gets an advantage from the prosthetics, it is unfair to the able-bodied competitors."

Because of his courage and determination, most were cheering for Pistorius to win. Still, it's clear that we are already in a time of ethical debate over whether artificial enhancements of human beings lead to unfair advantages of various kinds. When Pistorius competed two weeks later in the Paralympics, he himself lodged a protest against one of the other runners whose prosthetic blades, according to Pistorius, were too long compared to his height and gave him an unfair advantage.

In another example from athletics, the use of a hormone called erythropoietin (EPO)-which regulates the production of red blood cells-can give athletes a significant advantage by delivering more oxygen to the muscles for a longer period of time. One former winner of the Tour de France has already been stripped of his victory after he tested positive for elevated testosterone. He has admitted use of EPO, along with other illegal enhancements. More recently, seven-time Tour de France winner Lance Armstrong was stripped of his championships and banned from cycling for life after the U.S. Anti-Doping Agency released a report detailing his use of EPO, steroids, and blood transfusions, doping by other members of his team, and a complex cover-up scheme.

The authorities in charge of the Olympics and other athletic competitions have been forced into a genetic and biochemical arms race to develop ever more sophisticated methods of detecting new enhancements that violate the rules. What if the gene that produces extra EPO is spliced into an athlete's genome? How will that be detected?

At least one former Olympic multiple gold medal winner, Eero Mantyranta, the Finnish cross-country skier, was found years later to have a natural mutation that caused his body to produce more EPO than average-and thus more red blood cells. Clearly, that cannot be considered a violation of Olympic rules. Mantyranta competed in the 1960s, before the gene splicing technique was available. But if future Olympians show up with the same mutation, it may be impossible to determine whether it is natural or has been artificially spliced into their genomes. The splicing could be detected now, but scientists say that when the procedure is perfected, Olympic officials may not be able to make a ruling without genetic testing of the athlete's relatives.

In another example, scientists have now discovered ways to manipulate a protein called myostatin that regulates the building of muscles. Animals in which myostatin is blocked develop unnaturally large and strong muscles throughout their bodies. If athletes are genetically engineered to enhance their muscle development, does that constitute unfair competition? Isn't that simply a new form of doping, like the use of steroids and oxygen-rich blood injections? Yet here again, some people-including at least one young aspiring gymnast-have a rare but natural mutation that prevents them from producing a normal volume of myostatin, and results in supernormal musculature.

The convergence of genetic engineering and prosthetics is also likely to produce new breakthroughs. Scientists in California announced a new project in 2012 to create an artificial testicle, which they refer to as a human "sperm-making biological machine." Essentially a prosthesis, the artificial testicle would be injected every two months with sperm cells engineered from the man's own adult stem cells.

Some of the earliest applications of genetic research have been in the treatment of infertility. In fact, a great deal of the work since the beginning of the Life Sciences Revolution has focused on the beginning and the end of the human lifecycle-the reinvention of life and death.

THE CHANGING ETHICS OF FERTILITY

The birth in England of the first so-called test tube baby in 1978, Louise Brown, caused a global debate about the ethics and propriety of the procedure-a debate that in many ways established a template for the way publics react to most such breakthroughs. In the first stage, there is a measure of shock and awe, mingled with an anxious flurry of speculation as newly minted experts try to explore the implications of the breakthrough. Some bioethicists worried at the time that in vitro fertilization might somehow diminish parental love and weaken generational ties. But set against the unfocused angst and furrowed brows is the overflowing joy of the new parents whose dreams of a child have at last been realized. Soon thereafter, the furor dies down and fades away. As one U.S. bioethicist, Debra Mathews, put it, "People want children and no one wants anyone else to tell them they can't have them." Since 1978, more than five million children have been born to infertile people wanting children through the use of in vitro fertilization and related procedures.

During numerous congressional hearings on advances in life sciences research in the 1970s and 1980s, I saw this pattern repeated many times. Even earlier, in 1967, the first heart transplant by Dr. Christiaan Barnard in South Africa also caused controversy, but the joy and wonder of what was seen as a medical miracle put an end to the debate before it gained momentum. A doctor assisting in the operation, Dr. Warwick Peacock, told me that when the transplanted heart finally began to beat, Barnard exclaimed, "My God, it's working!" Later on, the first cloning of livestock and the commercialization of surrogate motherhood also caused controversies with very short half-lives.

Now, however, the torrent of scientific breakthroughs is leading to new fertility options that may generate controversies that don't fade as quickly. One new procedure involves the conception of an embryo and the use of preimplantation genetic diagnosis (PGD) to select a suitable "savior sibling" who can serve as an organ, tissue, bone marrow, or umbilical cord stem cell donor for his or her sibling. Some bioethicists have raised concerns that the instrumental purpose of such conceptions devalues the child, though others ask why this must necessarily be the case. In theory, the parents can love and value both children equally even as they pursue a medically important cure for the first with the assistance of the second. Whether truly informed consent on the part of the donor child is plausible in this scenario is another matter.

Scientists and doctors at the Department of Reproductive Medicine at Newcastle University in England outlined a procedure for creating "three-parent babies," which would allow couples at high risk of passing on an incurable genetic illness carried in the mother's faulty mitochondrial DNA to have a healthy child. If a third person who does not have the genes in question allows her genes (the donor must be female) to be substituted for that portion of the embryo's genome, then the baby will escape the feared genetic condition. Ninety-eight percent of the baby's DNA would come from the mother and father; only 2 percent or so would come from the gene donor. However, this genetic modification is one that will affect not only the baby, but all of its offspring, in perpetuity. As a result, the doctors have asked for a government review to determine whether the procedure is acceptable under Britain's laws.

When choices such as these are in the hands of parents rather than the government, most people adopt a different standard for deciding how they feel about the procedure in question. The great exception is the continuing debate over the ethics of abortion. In spite of the passionate opposition to abortion among many thoughtful people, the majority in most countries seem to override whatever degree of uneasiness they have about the procedure by affirming the principle that it is a decision that should properly be left to the pregnant woman herself, at least in the earlier stages of the pregnancy.

Nevertheless, the dispersal of new genetic options to individuals is, in some countries, leading to new laws regulating what parents can and cannot do. India has outlawed genetic testing of embryos, or even blood tests, that are designed to identify the gender of the embryo. The strong preference by many Indian parents that their next child be male, particularly if they already have a daughter, has already led to the abortion of 500,000 female fetuses each year and a growing imbalance of the male to female sex ratio in the population. (Among the many cultural factors that have long been at work in producing the preference for baby boys is the high cost of the dowry that must be paid by the parents of a bride.) The 2011 provisional census in India, which showed a further steep decline in the child sex ratio, led the Indian government to launch a new campaign to better enforce the prohibition against the sex selection of children.

Most of the prenatal gender identification procedures in India utilize ultrasound machines rather than riskier procedures such as amniocentesis, and the prevalence of advertising for low-cost ultrasound clinics is a testament to the popularity of the procedure. Although sex-selective abortions are illegal in India, proposed bans on ultrasound machines have not gained support, in part because of their other medical uses. Some couples from India-and other countries-are now traveling to Thailand, where the successful "medical tourism" industry is offering preimplantation genetic diagnosis procedures to couples intent on having a baby boy. A doctor at one of these clinics said that he has never had a request for a female embryo.

Now a scientific breakthrough allows the testing of fetal DNA in blood samples taken from pregnant mothers; experts say the test is 95 percent accurate in determining gender seven weeks into the pregnancy, and becomes even more accurate as the pregnancy proceeds. One company making test kits, Consumer Genetics Inc., of Santa Clara, California, requires women to sign an agreement not to use the test results for sex selection; the company has also announced that it will not sell the kits in India or China.

In 2012, researchers at the University of Washington announced a breakthrough in the sequencing of almost the entire genome of a fetus from the combination of a blood sample from the pregnant woman and a saliva sample from the father. Although the process is still expensive (an estimated $20,000 to $50,000 for one fetal genome-last year, the cost was $200,000 per test), the cost is likely to continue falling very quickly. Soon after this breakthrough was announced, a medical research team at Stanford announced an improved procedure that does not require a genetic sample from the father and is expected to be widely available within two years for an estimated $3,000.
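
Those price points imply a very steep rate of decline. A minimal back-of-the-envelope sketch, using only the dollar figures quoted above and assuming (purely for illustration) that the one-year rate of decline holds steady, shows how quickly the cost would approach the projected $3,000 figure:

```python
# Rough sketch of how fast fetal genome sequencing costs would fall if the
# reported one-year drop (about $200,000 down to roughly $35,000, the midpoint
# of the $20,000-$50,000 range) were sustained. The constant annual rate of
# decline is an assumption for illustration, not a forecast.

start_cost = 200_000      # reported cost per test one year earlier
current_cost = 35_000     # midpoint of the current $20,000-$50,000 estimate

annual_ratio = current_cost / start_cost         # ~0.175, i.e. ~82% decline per year

cost = current_cost
for year in range(1, 4):
    cost *= annual_ratio                         # apply the same ratio each year
    print(f"Year {year}: ~${cost:,.0f}")
# At this assumed pace the cost drops below the $3,000 cited for the Stanford
# procedure within about two years.
```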

While so much attention has been focused on the gender screening of embryos, tremendous progress has been made on the screening for genetic markers that identify serious disorders that might be treated through early detection. Of the roughly four million babies born in the United States each year, for example, approximately 5,000 have genetic or functional disorders amenable to treatment if discovered early. Since newborn babies are routinely screened on the day of their birth for more than twenty diseases, the new ease with which genetic screening can be done on embryos is, in one sense, just an extension of the process already performed routinely immediately after birth.

The ethical implications are quite different, however, because of the possibility that knowledge of some condition or trait in the embryo could lead the parents to perform an abortion. Indeed, the termination of pregnancies involving fetuses with serious genetic defects is common around the world. A recent U.S. study, for example, found that more than 90 percent of American women who find that the fetus they are carrying has Down syndrome are terminating their pregnancies. The author of an article provocatively titled "The Future of Neo-Eugenics," Armand Leroi at Imperial College in the U.K., wrote, "The widespread acceptance of abortion as a eugenic practice suggests that there might be little resistance to more sophisticated methods of eugenic selection and, in general, this has been the case."

Scientists say that within this decade, they expect to develop the ability to screen embryos for such traits as hair and eye color, skin complexion, and a variety of other traits-including some that have been previously thought of as behavioral but which some scientists now believe have heavy genetic components. Dr. David Eagleman, a neuroscientist at the Baylor College of Medicine, notes, "If you are a carrier of a particular set of genes, the probability that you will commit a violent crime is four times as high as it would be if you lacked those genes.... The overwhelming majority of prisoners carry these genes; 98.1 percent of death-row inmates do."

If prospective parents found that set of genes in the embryo they were considering for implantation, would they be tempted to splice them out, or select a different embryo instead? Will we soon be debating "distributed eugenics"? As a result of these and similar developments, some bioethicists are expressing concern that what Leroi called "neo-eugenics" will soon confront us with yet another round of difficult ethical choices.

Already, in vitro fertilization clinics are now using preimplantation genetic diagnosis (PGD) to scan embryos for markers associated with hundreds of diseases before implantation. Although the United States has more regulations in the field of medical research than most countries, PGD is still completely unregulated. Consequently, it may be only a matter of time before a much wider range of criteria-including cosmetic or aesthetic factors-are presented as options for parents to select in the screening process.

One question that has already arisen is the ethics of disposing of embryos that are not selected for implantation. If they are screened out as candidates, they can be frozen and preserved for potential later implantation-and that is an option chosen by many women who undergo the in vitro fertilization procedure. However, often several embryos are implanted simultaneously in order to improve the odds that one will survive; that is the principal reason why multiple births are far more common with in vitro fertilization than in the general population.

The United Kingdom has set a legal limit on the number of embryos that doctors can implant, in order to decrease the number of multiple births and avoid the associated complications for the mothers and babies-and the additional cost to the health care system. As a result, one company, Auxogyn, is using digital imaging, in conjunction with a sophisticated algorithm, to monitor the developing embryos every five minutes-from the moment they are fertilized until one of them is selected for implantation. The purpose is to select the embryo that is most likely to develop in a healthy way.
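
Auxogyn's actual selection algorithm is proprietary and not described here; the following is only a hypothetical sketch of the general idea of time-lapse embryo scoring, in which an embryo is ranked by how many of its observed early cell-division times fall inside assumed healthy timing windows (all names, timing windows, and data below are invented for illustration):

```python
# Hypothetical sketch of time-lapse embryo scoring: rank embryos by how many of
# their observed division times fall inside "healthy" timing windows.
# The windows and data are illustrative only, not Auxogyn's method.

HEALTHY_WINDOWS = {            # hours after fertilization (assumed values)
    "first_division": (24.0, 30.0),
    "second_division": (35.0, 40.0),
    "third_division": (45.0, 52.0),
}

def score(observed_times: dict) -> int:
    """Count how many division events fall inside their expected window."""
    return sum(
        lo <= observed_times.get(event, -1.0) <= hi
        for event, (lo, hi) in HEALTHY_WINDOWS.items()
    )

embryos = {
    "embryo_A": {"first_division": 26.1, "second_division": 37.5, "third_division": 50.0},
    "embryo_B": {"first_division": 33.0, "second_division": 41.2, "third_division": 55.4},
}

best = max(embryos, key=lambda name: score(embryos[name]))
print(best, "scores", score(embryos[best]), "of", len(HEALTHY_WINDOWS))
```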

As a practical matter, most realize that it is only a matter of time before the vast majority of frozen embryos are discarded-which raises the same underlying issue that motivates the movement to stop abortions: is an embryo in the earliest stages of life entitled to all of the legal protections available to individuals after they are born? Again, regardless of misgivings they may have, the majority in almost every country have reached the conclusion that even though embryos mark the first stage of human life, the practical differences between an embryo, or fetus, and an individual are nevertheless significant enough to allow the pregnant woman to control the choice on abortion. That view is consistent with a parallel view of the majority in almost every country that the government does not have the right to require a pregnant woman to have an abortion.

The furor over embryonic stem cell research grows out of a related issue. Even if it is judged appropriate for women to have the option of terminating their pregnancies-under most circumstances-is it also acceptable for the parents to give permission for "experimentation" on the embryo to which they have given the beginning of a life? Although this controversy is far from resolved, the majority of people in most countries apparently feel that the scientific and medical benefits of withdrawing stem cells from embryos are so significant that they justify such experiments. In many countries, the justification is linked to a prior determination that the embryos in question are due to be discarded in any case.

The discovery of nonembryonic stem cells (induced pluripotent, or iPS cells) by Shinya Yamanaka at Kyoto University (who was awarded the 2012 Nobel Prize in Medicine) is creating tremendous excitement about a wide range of new therapies and dramatic improvements in drug discovery and screening. In spite of this exciting discovery, however, many scientists still argue that embryonic stem cells may yet prove to have unique qualities and potential that justify their continued use. Researchers at University College London have already used stem cells to successfully restore some vision to mice with an inherited retinal disease, and believe that some forms of blindness in humans may soon be treatable with similar techniques. Other researchers at the University of Sheffield have used stem cells to rebuild nerves in the ears of gerbils and restore their hearing.

In 2011, Japanese fertility scientists at Kyoto University caused a stir when they announced that they had successfully used embryonic mouse stem cells to produce sperm after transplanting the cells into the testicles of infertile mice. When the sperm was then extracted and put into mouse eggs, the fertilized eggs were transferred to the uteri of female mice and resulted in normal offspring that could then reproduce naturally. Their work builds on a 2006 breakthrough in England, in which biologists at the University of Newcastle upon Tyne first produced functioning sperm cells converted from stem cells and produced live offspring, though the offspring had genetic defects.

One reason why these studies drew such attention was that the same basic technique, as it is developed and perfected, may soon make it possible for infertile men to have biological children-and opens the possibility for gay and lesbian couples to have children that are genetically and biologically their own. Some headline writers also savored the speculation that there is no reason why women cannot, in theory, produce their own sperm cells using this technique: "Will Men Become Obsolete?" On the lengthening list of potentially disquieting outcomes from genetic research, this possibility appears destined to linger near the bottom, though I am certainly biased in making that prediction.

LIFESPANS AND "HEALTHSPANS"

Just as scientists working on fertility have focused on the beginning of life, others have been focused on the end of life-and have been making dramatic progress in understanding the factors affecting longevity. They are developing new strategies, which they hope will achieve not only significant extensions in the average human lifespan, but also the extension of what many refer to as the "healthspan"-the number of years we live a healthy life without debilitating conditions or diseases.

Although a few scientific outliers have argued that genetic engineering could increase human lifespans by multiple centuries, the consensus among many aging specialists is that an increase of up to 25 percent is more likely to be the range of what is possible. According to most experts, evolutionary theory and numerous studies in human and animal genetics lead them to the conclusion that environmental and lifestyle factors account for roughly three quarters of the aging process and that genetics makes a more modest contribution-somewhere between 20 and 30 percent.

One of the most famous studies of the relationship between lifestyle and longevity showed that extreme caloric restriction extends the lives of rodents dramatically, although there is debate about whether this lifestyle adjustment has the same effect on longevity in humans. More recent studies have shown that rhesus monkeys do not live longer with severe caloric restrictions. There is a subtle but important distinction, experts on all sides point out, between longevity and aging. Although they are obviously related, longevity measures the length of life, whereas aging is the process by which cell damage contributes over time to conditions that bring the end of life.

Some highly questionable therapies, such as the use of human growth hormone in an effort to slow or reverse unwanted manifestations of the aging process, may well have side effects that shorten longevity, such as triggering the onset of diabetes and the growth of tumors. Other hormones that have been used to combat symptoms of aging-most prominently, testosterone and estrogen-have also led to controversies about side effects that can shorten longevity for certain patients.

However, excitement was also stirred by a Harvard study in 2010 that showed that the aging process in mice could be halted and even reversed by the use of the enzyme telomerase, which protects the telomeres-the protective caps on the ends of chromosomes-from damage. Scientists have long known that these telomeres get shorter with the aging of cells and that this shortening process can ultimately halt the renewal of the cells through replication. As a result of the Harvard study, scientists are exploring strategies for protecting the telomeres in order to retard the aging process.
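
A deliberately simplified toy model of the mechanism just described, with invented numbers rather than biological measurements, illustrates why progressive telomere shortening eventually halts cell replication and how telomerase activity counteracts it:

```python
# Toy model of telomere shortening: each cell division trims the telomere, and
# replication stops once it falls below a critical length. Telomerase, as in the
# Harvard study, counteracts this by adding length back. All numbers are
# illustrative assumptions, not biological measurements.

def divisions_until_senescence(telomere_length=10_000, loss_per_division=100,
                               telomerase_restore=0, critical_length=4_000):
    divisions = 0
    while telomere_length > critical_length:
        telomere_length -= loss_per_division     # erosion at each replication
        telomere_length += telomerase_restore    # partial repair by telomerase
        divisions += 1
        if divisions > 1_000_000:                # guard against a runaway loop
            break
    return divisions

print(divisions_until_senescence())                       # no telomerase activity
print(divisions_until_senescence(telomerase_restore=50))  # partial telomerase activity
```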

Some researchers are optimistic that extensive whole genome studies of humans with very long lifespans may yet lead to the discovery of genetic factors that can be used to extend longevity in others. However, most of the dramatic extensions in the average human lifespan over the last century have come from improvements in sanitation and nutrition, and from medical breakthroughs such as the discovery of antibiotics and the development of vaccines. Further improvements in these highly successful strategies are likely to further improve average lifespans-probably, scientists speculate, at the rate of improvement we have become used to-about one extra year per decade.

In addition, the continued global efforts to fight infectious disease threats are also extending average lifespans by reducing the number of premature deaths. Much of this work is now focused on malaria, tuberculosis, HIV/AIDS, influenza, viral pneumonia, and multiple so-called "neglected tropical diseases" that are barely known in the industrialized world but afflict more than a billion people in developing tropical and subtropical countries.

THE DISEASE FRONT

There has been heartening progress in reducing the number of people who die of AIDS each year. In 2012, the number fell to 1.7 million, significantly down from its 2005 peak of 2.3 million. The principal reason for this progress is greater access to pharmaceuticals-particularly antiretroviral drugs-that extend the lifespan and improve the health of people who have the disease. Efforts to reduce the infection rate continue to be focused on preventive education, the distribution of condoms in high-risk areas, and accelerated efforts to develop a vaccine.

Malaria has also been reduced significantly over the past decade with a carefully chosen combination of strategies. Although the largest absolute declines were in Africa, according to the U.N., 90 percent of all malaria deaths still take place in Sub-Saharan Africa-most of them involving children under five. Although an ambitious effort in the 1950s to eradicate malaria did not succeed, a few of those working hard to eradicate malaria, including Bill Gates, now believe that their goal may actually be realistic within the next few decades.

The world did succeed in eliminating the terrible scourge of smallpox in 1980. And in 2011 the U.N. Food and Agriculture Organization succeeded in eliminating a second disease, rinderpest, a relative of measles that killed cattle and other animals with cloven hooves. Because it was an animal disease, rinderpest never garnered the global attention that smallpox commanded, but it was one of the deadliest and most feared threats to those whose families and communities depend on livestock.

For all of the appropriate attention being paid to infectious diseases, the leading causes of death in the world today, according to the World Health Organization, are chronic diseases that are not communicable. In the last year for which statistics are available, 2008, approximately 57 million people died in the world, and almost 60 percent of those deaths were caused by chronic diseases, principally cardiovascular disease, diabetes, cancer, and chronic respiratory diseases.

Cancer is a special challenge, in part because it is not one disease, but many. The U.S. National Cancer Institute and the National Human Genome Research Institute have been spending $100 million per year on a massive effort to create a "Cancer Genome Atlas," and in 2012 one of the first fruits of this project was published in Nature by more than 200 scientists who detailed genetic peculiarities in colon cancer tumors. Their study of more than 224 tumors has been regarded as a potential turning point in the development of new drugs that will take advantage of vulnerabilities they found in the tumor cells.

In addition to focusing on genomic analyses of cancer, scientists are exploring virtually every conceivable strategy for curing cancers. They are investigating new possibilities for shutting off the blood supply to cancerous cells, dismantling their defense mechanisms, and boosting the ability of natural immune cells to identify and attack the cancer cells. Many are particularly excited about new strategies that involve proteomics-the decoding of all of the proteins translated from cancer genes in the various forms of cancer-and about targeting epigenetic abnormalities.

Scientists explain that while the human genome is often characterized as a blueprint, it is actually more akin to a list of parts or ingredients. The actual work of controlling cellular functions is done by proteins that carry out a "conversation" within and between cells. These conversations are crucial in understanding "systems diseases" like cancer.

One of the promising strategies for dealing with systemic disorders like cancer and chronic heart diseases is to strengthen the effectiveness of the body's natural defenses. And in some cases, new genetic therapies are showing promise in doing so. A team of scientists at the University of California San Francisco Gladstone Institutes of Cardiovascular Disease has dramatically improved cardiac function in adult mice by reprogramming cells to restore the health of heart muscles.

In many if not most cases, though, the most effective strategy for combating chronic diseases is to make changes in lifestyles: reduce tobacco use, reduce exposure to carcinogens and other harmful chemicals in the environment, reduce obesity through better diet and more exercise, and-at least for salt-sensitive individuals-reduce sodium consumption in order to reduce hypertension (or high blood pressure).

Obesity-which is a major causal factor in multiple chronic diseases-was the subject of discouraging news in 2012 when the British medical journal The Lancet published a series of studies indicating that one of the principal factors leading to obesity, physical inactivity and sedentary lifestyles, is now spreading from North America and Western Europe to the rest of the world. Researchers analyzed statistics from the World Health Organization to demonstrate that more people now die every year from conditions linked with physical inactivity than die from smoking. The statistics indicate that one in ten deaths worldwide is now due to diseases caused by persistent inactivity.

Nevertheless, there are good reasons to hope that new strategies combining knowledge from the Life Sciences Revolution with new digital tools for monitoring disease states, health, and wellness may spread from advanced countries as cheaper smartphones are sold more widely throughout the globe. The use of intelligent digital a.s.sistants for the management of chronic diseases (and as wellness coaches) may have an extremely positive impact.

In developed nations, there are already numerous smartphone apps that assist those who wish to keep track of how many calories they consume, what kinds of food they are eating, how much exercise they are getting, how much sleep they are getting (some new headbands also keep track of how much deep sleep, or REM sleep, they are getting), and even how much progress they are making in dealing with addictions to substances such as alcohol, tobacco, and prescription drugs. Mood disorders and other psychological maladies are also addressed by self-tracking programs. During the 2012 summer Olympic Games in London, a number of athletes were persuaded by biotech companies attempting to improve their health-tracking devices to use glucose monitors and sleep monitors, and to receive genetic analyses designed to tailor nutrition to their individual needs.

Such monitoring is not limited to Olympians. Personal digital monitors of patients' heart rates, blood glucose, blood oxygenation, blood pressure, body temperature, respiratory rate, body fat levels, sleep patterns, medication use, exercise, and more are growing more common. Emerging developments in nanotechnology and synthetic biology also hold out the prospect of more sophisticated continuous monitoring from sensors inside the body. Nanobots are being designed to monitor changes in the bloodstream and vital organs, reporting information on a constant basis.
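
As a minimal illustration of what such continuous monitoring amounts to in software terms, a tracker simply accumulates timestamped readings and flags values that fall outside an expected range; the metric names and thresholds below are hypothetical examples, not clinical guidance or any particular device's design:

```python
# Minimal sketch of a personal health monitor: log timestamped readings and
# flag any that fall outside an assumed "normal" range. Metric names and
# thresholds are hypothetical examples, not clinical guidance.

from datetime import datetime

NORMAL_RANGES = {
    "heart_rate_bpm": (50, 100),
    "blood_glucose_mg_dl": (70, 140),
    "blood_oxygen_pct": (94, 100),
}

readings = []   # list of (timestamp, metric, value)

def record(metric, value):
    readings.append((datetime.now(), metric, value))
    lo, hi = NORMAL_RANGES[metric]
    if not lo <= value <= hi:
        print(f"ALERT: {metric} = {value} outside {lo}-{hi}")

record("heart_rate_bpm", 72)        # normal, no alert
record("blood_glucose_mg_dl", 185)  # prints an alert
```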

Some experts, including Dr. H. Gilbert Welch of Dartmouth, the author of Overdiagnosed: Making People Sick in the Pursuit of Health, believe that we are in danger of going too far in monitoring and data analysis of individuals who track their vital signs and more: "Constant monitoring is a recipe for all of us to be judged 'sick.' Judging ourselves sick, we seek intervention." Welch and some others believe that many of these interventions turn out to be costly and unnecessary. In 2011, for example, medical experts advised doctors to stop routinely using a new and sophisticated antigen test for prostate cancer precisely because the resulting interventions were apparently doing more harm than good.

The digitizing of human beings, with the creation of large files containing detailed information about their genetic and biochemical makeup and their behavior, will also require attention to the same privacy and information security issues discussed in Chapter 2. For the same reasons that this rich data is potentially so useful in improving the efficacy of health care and reducing medical costs, it is also seen as highly valuable to insurance companies and employers who are often eager to sever their relationships with customers and employees who represent high risks for big medical bills. Already, a high percentage of those who could benefit from genetic testing are refusing to have the information gathered for fear that they will lose their jobs and/or their health insurance.

A few years ago, the United States passed a federal law known as the Genetic Information Nondiscrimination Act, which prohibits the disclosure or improper use of genetic information. But enforcement is difficult and trust in the law's protection is low. The fact that insurance companies and employers usually pay for the majority of health care expenditures-including genetic testing-further reinforces the fear by patients and employees that their genetic information will not remain confidential. Many believe that flows of information on the Internet are vulnerable to disclosure in any case. The U.S. law governing health records, the Health Insurance Portability and Accountability Act, fails to guarantee patient access to records gathered from their own medical implants while companies seek to profit from personalized medical information.

Nevertheless, these self-tracking techniques-part of the so-called self-quantification movement-offer the possibility that behavior modification strategies that have traditionally been associated with clinics can be individualized and executed outside of an institutional setting. Expenditures for genetic testing are rising rapidly as prices for these tests continue to fall and as the wave of personalized medicine moves forward with increasing speed.

The United States may have the most difficulty in making the transition to precision medicine because of the imbalance of power and unhealthy corporate control of the public policy decision-making process, as described in Chapter 3. This chapter is not about the U.S. health care system, but it is interesting to note that the glaring inefficiencies, inequalities, and absurd expense of the U.S. system are illuminated by the developing trends in the life sciences. For example, many health care systems do not cover disease prevention and wellness promotion expenditures, because they are principally compensated for expensive interventions after a patient's health is already in jeopardy. The new health care reform bill enacted by President Obama required coverage of preventive care under U.S. health care plans for the first time.

As everyone knows, the U.S. spends far more per person on health care than any other country while achieving worse outcomes than many other countries that pay far less, and still, tens of millions do not have reasonable access to health care. Lacking any other option, they wait, often until their condition is so dire that they have to go to the emergency room, where the cost of intervention is highest and the chance of success is lowest. The recently enacted reforms will significantly improve some of these defects, but the underlying problems are likely to grow worse-primarily because insurance companies, pharmaceutical companies, and other health care providers retain almost complete control over the design of health care policy.

THE STORY OF INSURANCE

The business of insurance began as far back as ancient Rome and Greece, where life insurance policies were similar to what we now know as burial insurance. The first modern life insurance policies were not offered until the seventeenth century in England. The development of extensive railroad networks in the United States in the 1860s led to limited policies protecting against accidents on railroads and steamboats, and that led, in turn, to the first insurance policies protecting against sickness in the 1890s.

Then, in the early 1930s, when advances in medical care began to drive costs above what many patients could pay on their own, the first significant group health insurance policies were offered by nonprofits: Blue Cross for hospital charges and Blue Shield for doctors' fees. All patients paid the same premiums regardless of age or preexisting conditions. The success of the Blues led to the entry into the marketplace of private, for-profit health insurance companies, who began to charge different premiums to people based on their calculation of the risk involved-and refused to offer policies at all to those who represented an unacceptably high risk. Soon, Blue Cross and Blue Shield were forced by the new for-profit competition to also link premiums to risk.

When President Franklin Roosevelt was preparing his package of reforms in the New Deal, he twice took preliminary steps-in 1935 and again in 1938-to include a national health insurance plan as part of his legislative agenda. On both occasions, however, he feared the political opposition of the American Medical Association and removed the proposal from his plans lest it interfere with what he regarded as more pressing priorities in the depths of the Great Depression: unemployment compensation and Social Security. The introduction of legislation in 1939 by New York Democratic senator Robert Wagner offered a quixotic third opportunity to proceed, but Roosevelt chose not to support the legislation.

During World War II, with wages (and prices) controlled by the government, private employers began to compete for employees, who were scarce due to the war, by offering health insurance coverage. Then after the war, unions began to include demands for more extensive health insurance as part of their negotiated contracts with employers.

Roosevelt"s successor, Harry Truman, sought to revive the idea for national health insurance, but the opposition in Congress-once again fueled by the AMA-ensured that it died with a whimper. As a result, the hybrid system of employer-based health insurance became the primary model in the United States. Because older Americans and those with disabilities had a difficult time obtaining affordable health insurance within this system, new government programs were implemented to help both groups.

For the rest of the country, those who needed health insurance the most had a difficult time obtaining it, or paying for it when they could find it. By the time the inherent flaws and contradictions of this model were obvious, the American political system had degraded to the point that the companies with an interest in seeing this system continue had so much power that nothing could be done to change its basic structure.

With rare exceptions, the majority of legislators are no longer capable of serving the public interest because they are so dependent on campaign contributions from these corporate interests and so vulnerable to their nonstop lobbying. The general public is effectively disengaged from the debate, except to the extent that they absorb constant messaging from the same corporate interests-messages designed to condition their audience to support what the business lobbies want done.

GENETICALLY ENGINEERED FOOD

The same sclerosis of democracy is now hampering sensible adaptations to the wave of changes flowing out of the Life Sciences Revolution. For example, even though polls consistently show that approximately 90 percent of American citizens believe that genetically engineered food should be labeled, the U.S. Congress has adopted the point of view advocated by large agribusiness companies-that labeling is unnecessary and would be harmful to "confidence in the food supply."

However, most European countries already require such labeling. The recent approval of genetically engineered alfalfa in the U.S. provoked a larger outcry than many expected and the "Just Label It" campaign has become the centerpiece of a new grassroots push for labeling genetically modified (GM) food products in the United States, which plants twice as many acres in GM crops as any other country. Voters in California defeated a referendum in 2012 to require such labeling, after corporate interests spent $46 million on negative commercials, five times as much as proponents. Nevertheless, since approximately 70 percent of the processed foods in the U.S. contain at least some GM crops, this controversy will not go away.

By way of background, the genetic modification of plants and animals is, as enthusiastic advocates often emphasize, hardly new. Most of the food crops that humanity has depended upon since before the dawn of the Agricultural Revolution were genetically modified during the Stone Age by careful selective breeding-which, over many generations, modified the genetic structure of the plants and animals in question to manifest traits of value to humans. As Norman Borlaug put it, "Neolithic women accelerated genetic modifications in plants in the process of domesticating our food crop species."

By using the new technologies of gene splicing and other forms of genetic engineering, we are-according to this view-merely accelerating and making more efficient a long-established practice that has proven benefits and few if any detrimental side effects. And outside of Europe (and India) there is a consensus among most farmers, agribusinesses, and policymakers that GM crops are safe and must be an essential part of the world"s strategy for coping with antic.i.p.ated food shortages.

However, as the debate over genetically modified organisms (GMOs) has evolved, opponents of the practice point out that none of the genetic engineering has ever produced any increase in the intrinsic yields of the crops, and they have raised at least some ecosystem concerns that are not so easily dismissed. The opponents argue that the insertion of foreign genes into another genome is, in fact, different from selective breeding because it disrupts the normal pattern of the organism"s genetic code and can cause unpredictable mutations.

The first genetically engineered crop to be commercialized was a new form of tomato known as the FLAVR SAVR, which was modified to remain firm for a longer period of time after it ripened. However, the tomato did not succeed commercially because of high costs, and consumer resistance to tomato paste made from these tomatoes (which was clearly labeled as a GM product) caused the paste to be a failure as well.

Selective breeding was used to make an earlier change in the traits of commercial tomatoes in order to produce a flatter, less rounded bottom to accommodate the introduction of automation in the harvesting process. The new variety stayed on the conveyor belts without rolling off, was easier to pack into crates, and its tougher skin prevented the machines from crushing the tomatoes. They are sometimes called "square tomatoes," though they are not really square.

An even earlier modification of tomatoes, in 1930, also using selective breeding, was the one that resulted in what most tomato lovers regard as a catastrophic loss of flavor in modern tomatoes. The change was intended to enhance the mass marketing and distribution of tomatoes by ensuring that they were "all red" and ripened uniformly, without the green "shoulders" that consumers sometimes viewed as a sign that they were not yet ripe. Researchers working with the newly sequenced tomato genome discovered in 2012 that the elimination of the gene associated with green shoulders also eliminated the plant's ability to produce most of the sugars that used to give most tomatoes a delicious taste.

In spite of experiences such as these, which illustrate how changes made for the convenience and profitability of large corporations sometimes end up triggering other genetic changes that most people hate, farmers around the world-other than in the European Union-have adopted GM crops at an accelerating rate. Almost 11 percent of all the world's farmland was planted in GM crops in 2011, according to an international organization that promotes GMOs, the International Service for the Acquisition of Agri-biotech Applications. Over the last seven years, the number of acres planted in GM crops has increased almost 100-fold, and the almost 400 million acres planted in 2011 represented an increase of 8 percent from one year earlier.
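
A rough consistency check of the acreage figures quoted above, assuming the 11 percent share and the 400 million acres refer to the same 2011 planting, gives a sense of the scale involved; the derived numbers are approximate:

```python
# Back-of-the-envelope checks on the GM acreage figures quoted above.
# Inputs come from the paragraph; derived values are approximate.

gm_acres_2011 = 400e6        # "almost 400 million acres" planted in 2011
growth_last_year = 0.08      # an 8 percent increase over the prior year
share_of_farmland = 0.11     # "almost 11 percent of all the world's farmland"

gm_acres_2010 = gm_acres_2011 / (1 + growth_last_year)
world_farmland = gm_acres_2011 / share_of_farmland

print(f"Implied 2010 GM acreage: ~{gm_acres_2010 / 1e6:.0f} million acres")
print(f"Implied total world farmland: ~{world_farmland / 1e9:.1f} billion acres")
```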

Although the United States is by far the largest grower of GM crops, Brazil and Argentina are also heavily committed to the technology. Brazil, in particular, has adopted a fast-track approval system for GMOs and is pursuing a highly focused strategy for maximizing the use of biotechnology in agriculture. In developing countries overall, the adoption of modified crops is growing twice as fast as in mature economies. An estimated 90 percent of the 16.7 million farmers growing genetically engineered crops in almost thirty countries were small farmers in developing markets.

Genetically modified soybeans, engineered to tolerate Monsanto's Roundup herbicide, are the largest GM crop globally. Corn is the second most widely planted GM crop, although it is the most planted in the U.S. ("Maize" is the term used for what is called corn in the U.S.; the word "corn" is often used outside the U.S. to refer to any cereal crop.) In the U.S., 95 percent of soybeans planted and 80 percent of corn are grown from patented seeds that farmers must purchase from Monsanto or one of their licensees. Cotton is the third most planted GM crop globally, and canola (known as "rapeseed" outside the United States) is the other large GM crop in the world.

Although the science of genetically engineered plants is advancing quickly, the vast majority of GM crops grown today are still from the first of three generations, or waves, of the technology. This first wave, in turn, includes GM crops that fall into three different categories:

* The introduction of genes that give corn and cotton the ability to produce their own insecticide inside the plants;
* Genes introduced into corn, cotton, canola, and soybeans that make the plants tolerant of two chemicals contained in widely used weed killers that are produced by the same company-Monsanto-that controls the GM seeds; and
* The introduction of genes designed to enhance the survivability of crops during droughts.

In general, farmers using the first wave of GM crops report initial reductions in their cost of production-partly due to temporarily lower use of insecticide-and temporarily lower losses to insects or weeds. The bulk of the economic benefits thus far have gone to cotton farmers using a strain that is engineered to produce its own insecticide (Bacillus thuringiensis, better known as Bt). In India the new Bt cotton made the nation a net exporter, rather than importer, of cotton and was a factor in the initial doubling of cotton yields because of temporarily lower losses to insects and weeds. However, many Indian cotton farmers have begun to protest the high cost of the GM seeds they must purchase anew each year and the high cost of the herbicides they must use in greater volumes as more weeds develop resistance. A parliamentary panel in India issued a controversial 2012 report asserting that "there is a connection between Bt cotton and farmers' suicides" and recommending that field trials of GM crops "under any garb should be discontinued forthwith."

New scientific studies-including a comprehensive report by the U.S. National Research Council in 2009-support the criticism by opponents of GM crops that the intrinsic yields of the crops themselves are not increased at all. To the contrary, some farmers have experienced slightly lower intrinsic yields because of unexpected collateral changes in the plants' genetic code. Selective breeding, on the other hand, was responsible for the impressive and life-saving yield increases of the Green Revolution. New research by an Israeli company, Kaiima, into a non-GMO technology known as "enhanced ploidy" (the inducement, selective breeding, and natural enhancement of a trait that confers more than two sets of chromosomes in each cell nucleus) is producing both greater yields and greater resistance to the effects of drought in a variety of food and other crops. Recent field trials run by Kaiima show more than 20 percent yield enhancement in corn and more than 40 percent enhancement in wheat.

The genetic modification of crops, by contrast, has not yet produced meaningful enhancements of survivability during drought. While some GM experimental strains do, in theory, offer the promise of increased yields during dry periods, these strains have not yet been introduced on a commercial scale, and test plots have demonstrated only slight yield improvements thus far, and only during mild drought conditions. Because of the growing prevalence of drought due to global warming, there is tremendous interest in drought-resistant strains, especially for maize, wheat, and other crops in developing countries. Unfortunately, however, drought resistance is turning out to be an extremely complex challenge for plant geneticists, involving a combination of many genes working together in complicated ways that are not yet well understood.

After an extensive analysis of the progress in genetically engineering drought-resistant crops, the Union of Concerned Scientists found "little evidence of progress in making crops more water efficient. We also found that the overall prospects for genetic engineering to significantly address agriculture's drought and water-use challenges are limited at best."

The second wave of GM crops involves the introduction of genes that enhance the nutrient value of the plants. It includes the engineering of higher protein content in corn (maize) that is used primarily for livestock feed, and the engineering of a new strain of rice that produces extra vitamin A as part of a strategy to combat the deficiency in vitamin A that now affects approximately 250 million children around the world. This second wave also involves the introduction of genes that are designed to enhance the resistance of plants to particular fungi and viruses.

The third wave of GM crops, which is just beginning to be commercialized, involves the modification of plants through the introduction of genes that program the production of substances within the plants that have commercial value as inputs in other processes, including pharmaceutical inputs and biopolymers for the production of bioplastics that are biodegradable and easily recyclable. This third wave also involves an effort to introduce genes that modify plants with high cellulose and lignin in order to make them easier to process for the production of cellulosic ethanol. The so-called green plastics have exciting promise, but as with crops devoted to the production of biofuels, they raise questions about how much arable land can safely or wisely be diverted from the production of food in a world with growing population and food consumption, and shrinking assets of topsoil and water for agriculture.

Over the next two decades, seed scientists believe that they may be able to launch a fourth wave of GM crops by inserting the photosynthesis genes of corn (and other so-called C4 plants), which convert light into energy more efficiently, into plants like wheat and rice (and other C3 plants). If they succeed-which is far from certain because of the unprecedented complexity of the challenge-this technique could indeed bring about significant intrinsic yield increases. For the time being, however, the overall net benefits from genetically engineered crops have been limited to a temporary reduction in losses to pests and a temporary decrease in expenditures for insecticides.

In 2012, the Obama administration in the U.S. launched its National Bioeconomy Blueprint, specifically designed to stimulate the production-and procurement by the government-of such products. The European Commission adopted a similar strategy two months earlier. Some environmental groups have criticized both plans because of the growing concern about diverting cropland away from food production and the destruction of tropical forests to make way for more cropland.

The opponents of genetically modified crops argue that not only have these genetic technologies failed thus far to increase intrinsic yields, but also that the weeds and insects the GM crops are designed to control are quickly mutating to make themselves impervious to the herbicides and insecticides in question. In particular, the crops that are engineered to produce their own insecticide (Bacillus thuringiensis) are now so common that the constant diet of Bt being served to pests in large monocultured fields is doing the same thing to insects that the massive and constant use of antibiotics is doing to germs in the guts of livestock: it is forcing the mutation of new strains of pests that are highly resistant to the insecticide.

The same thing also appears to be happening to weeds that are constantly sprayed with herbicides to protect crops that have been genetically engineered to survive application of the herbicide (principally Monsanto's Roundup, whose active ingredient, glyphosate, used to kill virtually any green plant). Already, ten species of harmful weeds have evolved a resistance to these herbicides, requiring farmers to use other, more toxic herbicides. Some opponents of GM crops have marshaled evidence tending to show that over time, as resistance increases among weeds and insects, the overall use of both herbicides and pesticides actually increases, though advocates of GM crops dispute their analysis.

Because so many weeds have now developed resistance to glyphosate (most commonly used in Roundup), there is a renewed market demand for more powerful-and more dangerous-herbicides. There are certainly plenty to choose from. The overall market for pesticides in the world represents approximately $40 billion in sales annually, with herbicides aimed at weeds representing $17.5 billion and both insecticides and fungicides representing about $10.5 billion each.

Dow AgroSciences has applied for regulatory approval to launch a new genetically engineered form of corn that tolerates the application of a pesticide known as 2,4-D, which was a key ingredient in Agent Orange-the deadly herbicide used by the U.S. Air Force to clear jungles and forest cover during the Vietnam War-which has been implicated in numerous health problems suffered by both Americans and Vietnamese who were exposed to it. Health experts from more than 140 NGOs have opposed the approval of what they call "Agent Orange corn," citing links between exposure to 2,4-D and "major health problems such as cancer, lowered sperm counts, liver toxicity and Parkinson's disease. Lab studies show that 2,4-D causes endocrine disruption, reproductive problems, neurotoxicity, and immunosuppression."

Insecticides that are sprayed on crops have also been implicated in damage to beneficial insects and other animals. The milkweed plants on which monarch butterflies almost exclusively depend have declined in the U.S. farm belt by almost 60 percent over the last decade, principally because of the expansion of cropland dedicated to crop varieties engineered to be tolerant of Roundup. There have been studies showing that Bt crops (the ones that produce insecticide) have had a direct harmful impact on at least one subspecies of monarchs, and on lacewings (considered highly beneficial insects), ladybird beetles, and beneficial biota in the soil. Although proponents of GM crops have minimized the importance of these effects, they deserve close scrutiny as GM crops continue to expand their role in the world's food production.

Most recently, scientists have attributed the disturbing and previously mysterious sudden collapses of bee colonies to a new group of pesticides known as neonicotinoids. Colony collapse disorder (CCD) has caused deep concern among beekeepers and others since the affliction first appeared in 2006. Although numerous theories about the cause of CCD were put forward, it was not until the spring of 2012 that several studies pinpointed the cause.