Page 2 of 2
Given this knowledge, the modern tools of biotechnology allow us to do something amazing: We can alter the translational code within an organism by modifying the DNA bases of its genome. Because viruses carry no translation machinery of their own and must borrow the host's, their genes would be mistranslated in a recoded cell, making the organism effectively immune to viral infection. My colleagues and I are exploring this within E. coli, the microbial powerhouse of the biotech world. By changing a specific 314 of the 5 million bases in the E. coli genome, we can change one of its 64 codons. In 2009 this massive (albeit nanoscale) construction project is nearing completion via breakthroughs in our ability to “write” genomes. This process is increasingly automated and inexpensive — soon it will be relatively easy to change multiple codons. Viral genomes range from 5,000 to a million bases in length, and each of the 64 codons is present, on average, 20 times. This means that to survive the change of a single codon in its host, a virus would require 20 simultaneous, specific, spontaneous changes to its genome. Even in viruses with very high mutation rates, such as HIV, the chance of getting a mutant virus with the correct 20 changes and zero lethal mutations is infinitesimally small.
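A rough back-of-the-envelope calculation shows just how small that escape probability is. The mutation rate and genome size below are illustrative assumptions (an HIV-like rate of about 3 × 10⁻⁵ per base per replication is a common textbook figure), not numbers from the text:

```python
# Sketch: probability that a virus makes 20 specific, simultaneous base
# changes in a single replication. All numbers are illustrative assumptions.

mu = 3e-5                 # assumed per-base mutation rate per replication
required_changes = 20     # one specific change per occurrence of the codon

# Probability of one *specific* substitution at one specific position:
# the base must mutate (mu) AND land on the one correct base out of
# the three possible alternatives (1/3).
p_one = mu / 3

# Probability that all 20 specific changes happen in one replication.
# (This ignores the further requirement of zero lethal mutations
# elsewhere, which only makes escape even less likely.)
p_escape = p_one ** required_changes

print(f"per-site probability: {p_one:.1e}")
print(f"20-site escape probability: {p_escape:.1e}")
```

Even granting a virus astronomically many replication events, a probability on the order of 10⁻¹⁰⁰ is effectively zero, which is the essay's point.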
Altering the translational codes of genetically engineered organisms (GEOs) could have an important additional benefit. GEOs are very unpopular in some communities (e.g., Europe) in part because of concerns that engineered genes might become ecologically invasive, a sort of molecular kudzu. This is a legitimate concern, but we must also acknowledge that it’s insufficient to simply always choose “natural” over “unnatural.” Utilizing new translational codes in GEOs might provide the isolation from functional gene exchange that we’ve been looking for.
If we engineer organisms to be resistant to all viruses, we must anticipate that without viruses to hold them in check, these GEOs could take over ecosystems. This might be handled by making engineered cells dependent on nutritional components absent from natural environments. For example, we can delete the genes required to make diaminopimelate, an organic compound that is essential for bacterial cell walls (and hence bacterial survival) yet very rare in humans and our environment. The geneticist Roy Curtiss and his colleagues have already pioneered this protective measure. Or perhaps we can make our favorite GEO strain addicted to a totally unnatural amino acid like fluorotryptophan, as conceived by Andrew Ellington and his coworkers. Even if such GEOs escaped the laboratory, they would not find fluorotryptophan or diaminopimelate and would quickly die — and they couldn’t be rescued by exchanging DNA with other microbes.
But actions speak louder than words. These safety features will be accepted and used only if they undergo rigorous testing in physical isolation and review by a diversity of critics. The battery of necessary tests is formidable: it includes ensuring that GEOs are not toxic to immunocompromised lab animals, as well as laboratory tests for ecological risks such as unwanted gene transfer and harmful mutations. If we can construct safety measures that pass all these tests, the door will open to more sophisticated biotechnological interventions in areas like human health.
We already have a mandate in the form of the emergence of the HIV pandemic; those infected currently require a lifetime of expensive drugs to stay symptom-free. A once-in-a-lifetime injection of bioengineered stem cells capable of making HIV-resistant blood T-cells might prove more cost effective — and might be closer at hand than the elusive HIV vaccine. We routinely transplant blood stem cells, building on Don Thomas's pioneering work in the 1950s. Current procedures, such as harvesting cells from bone marrow and irradiating the recipient, are inefficient and dangerous; these obstacles could be overcome with bioengineering. In that context, the removal of viral receptors or the addition of antiviral gene networks to those stem cells could become very attractive strategies. Further, the problems of cancer and aging lie in the fundamental “design” of our genomes. It would be surprising if we could fix such planned obsolescence with pharmaceuticals consisting of a few atoms (or “bits” of target-binding information) — but with proper bioengineering, we could change the gigabits of faulty software in our cells.
Still, all discussions of accelerating technology, unintended consequences, and safeguards could eclipse a larger concern: Are we simply going too fast? How do we decide on an optimal pace for our technological progression? We have become accustomed to twofold improvement every two years in the costs of computing and digital telecommunications — from French semaphore lines in 1792 to multiplexed optical fibers today. The pace of cost improvement in “reading” DNA followed a similar curve from 1968 until recently, but in 2004 it suddenly jumped to tenfold per year, a pace that continues today. Similar exponential advances in “writing” DNA have been evident since the 1970s. These three exponential technologies — computing, DNA reading, and DNA writing — might become increasingly synergistic, with potentially profound effects.
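The gap between those two paces is easy to underestimate. A quick sketch comparing the cost-improvement factors over a decade (the ten-year window is an assumption chosen for illustration):

```python
# Compare cost-improvement factors over one decade under the two paces
# described in the text: Moore's-law-style doubling every two years
# versus the post-2004 tenfold-per-year improvement in DNA "reading".

years = 10  # illustrative time span

moore_factor = 2 ** (years / 2)   # 2x every 2 years -> 5 doublings
reading_factor = 10 ** years      # 10x every year for 10 years

print(f"Computing:   ~{moore_factor:.0f}x cheaper in {years} years")
print(f"DNA reading: ~{reading_factor:.0e}x cheaper in {years} years")
```

A 32-fold improvement versus a ten-billion-fold improvement over the same decade: that difference in slope is why sequencing costs fell off the historical curve so dramatically after 2004.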
My hope for the future is that our accelerating technologies will bring improvements in standards of living, accompanied by shifts to sustainable population sizes and better access to health care and education. At the other extreme, physical or social limitations could cause technology to level off and stagnate at exactly the time when we desperately need to make rapid progress. Ultimately, our future will be what we make of it. Let us choose wisely, with carefully engineered safety and broad community engagement.

George Church is director of the Center for Computational Genetics at Harvard Medical School.
Originally published February 2, 2009