Credit: Flickr user Toby___
Humans are carbon-bond consumers. Carbon bonds come into the front end of your feeding tubes in the form of fats, carbohydrates, and proteins; you then break those chemical bonds to extract energy, and excrete the residue as carbon dioxide, water vapor, and various solid wastes. Sometimes, however, some of these chemicals make their way from your digestive system into your brain; the consequences can be subtle or profound.
The distinction between what is considered a food (something that your body wants or needs in order to function optimally) and what is considered a drug (something that your brain wants or needs in order to function optimally) is becoming increasingly difficult to define. Indeed, the routine use of some substances, such as stimulants and depressants, is so universal that most of us do not even consider them to be drugs but, rather, actual food. Is coffee, tea, tobacco, alcohol, cocoa, or marijuana a nutrient or a drug?
In truth, anything you take into your body should be considered a drug, whether it’s obviously nutritious or not. As you will see, even molecules that are clearly nutritious (such as essential amino acids like lysine and tryptophan—available in bulk at your nearest grocery store) exhibit properties that many of us would attribute to a drug.
The foods we eat, and many of our most popular psychoactive drugs, come from plants or animals. The ingredients in these plant and animal products are very similar if not identical to the neurotransmitters our brains and bodies use to function normally. This is why the contents of our diets can interact with our neurons to influence brain function, and it highlights a very important principle: The chemicals in the food that you eat will only act upon your brain if in some way those chemicals resemble an actual neurotransmitter or otherwise interact with a biochemical process in your brain that influences the production, release, or inactivation of a neurotransmitter. These “active” ingredients deserve close scrutiny.
How is it possible that plants and humans use such similar chemicals for normal, everyday functions? Plants produce chemicals that are capable of affecting our brain because they share an evolutionary history with us on this planet. Even primitive one-celled organisms produce many of the same chemicals that are in our brains. Therefore, whether you choose to eat a bunch of broccoli, a plate of sushi, or a heap of amoebas, the chemicals each meal contains may alter how your neurons function and, therefore, how you feel or think. We have all experienced the consequences of our shared evolutionary history with the plants we eat. For example, unripe bananas contain the neurotransmitter serotonin. When you eat an unripe banana, its serotonin is free to act upon the serotonin neurons within your digestive tract. The consequence is likely to be increased activation of the muscles in the wall of your intestines, usually experienced as diarrhea.
Many plants contain compounds that should be able to enhance your brain’s performance. For example, potatoes, tomatoes, and eggplants contain solanine and α-chaconine, substances that can enhance the action of acetylcholine, a chemical in your brain that is vital to memory formation. Your mood might be enhanced slightly by eating fava beans because they contain L-DOPA, a precursor to the production of dopamine, the reward chemical in your brain. Whether these food-borne compounds actually affect your brain depends upon how much you consume and your own personal physiology. This might explain why some people find it quite rewarding to eat potatoes or eggplants.
Morphine-like chemicals capable of acting upon the brain are produced in your intestines when you consume milk, eggs, cheese, spinach, mushrooms, pumpkin, and various fish and grains. Dairy products in particular contain a protein known as casein, which enzymes in your intestines can convert into beta-casomorphin. In newborns, that beta-casomorphin can easily pass out of the immature gut and into the developing brain to produce euphoria.
The pleasurable feeling produced by this opiate-like compound in newborn mammals after their first taste of their mother’s milk is believed to encourage the infant to return again and again for nourishment. Thus, being able to experience the euphoria induced by this opiate-like chemical has life and death consequences for the newborn child. Adults do not experience this euphoria after drinking milk due to the presence of well-developed blood–gut and blood–brain barriers. Perhaps if a glass of milk provided us with the euphoria of opiates and the pain relief of morphine, then dairy cows would only be sold on the black market!
Even the spices we use to flavor our food and drink may contain psychoactive chemicals that can alter the function of the brain. The spice nutmeg comes from the nutmeg tree, Myristica fragrans, and contains myristicin, which is chemically quite similar to mescaline and amphetamine. Myristicin and related compounds are also found in carrots, parsley, fennel, dill, and a few other spices—but at very low concentrations, so not to worry about getting intoxicated at the hors d’oeuvres tray. Typically, one must consume about 30 grams of nutmeg powder—or roughly the contents of an entire container of the product that you could purchase at your local grocery store—to experience its psychoactive effects. A single slice of pumpkin pie is unlikely to produce any noticeable effects upon the psyche. Reactions to nutmeg vary considerably, from nothing at all, to euphoria at low doses, to marijuana- and LSD-like experiences at higher doses, with hallucinations that can last up to 48 hours. It all depends upon the amount consumed.
Similar to the unpleasant repercussions of eating unripe bananas, one other side effect of consuming nutmeg is extreme diarrhea caused by the stimulation of sensitive neurons within the intestines. Given this disagreeable reaction, it is surprising that nutmeg has also been claimed to be a potent aphrodisiac. Perhaps for these reasons, one of my students once consumed an entire canister of nutmeg that he had dissolved in some applesauce; the weekend he spent in the bathroom rather than the bedroom demonstrated why most people never try nutmeg more than once.
Regarding caffeine and nicotine, arguably our two most beloved plant-derived drugs, so much has been written that I will not repeat any of it here. However, as a self-confessed chocoholic I am compelled to discuss why chocolate can be so addicting. Chocolate contains a bit of caffeine, but also an array of other psychoactive compounds that may contribute to the pleasurable sensation of eating it. Chocolate contains phenethylamine, a molecule that resembles amphetamine, and a small amount of a chemical called anandamide, which resembles the active ingredient in marijuana. Anandamide happens to be used by our brain as a regular neurotransmitter and appears to be critical for us to experience pleasure.
Chocolate also contains some estrogen-like compounds, a fact that may explain a recent series of reports showing that men who eat chocolate live longer than men who do not eat chocolate. The effect was not seen for women, who have an ample supply of their own estrogen until menopause. Post-menopausal women still may gain benefits from being chocoholics, though, because chocolate also contains magnesium salts that may reduce the frequency and severity of hot flashes and night sweating. And finally, a standard bar of chocolate contains as many antioxidants as a glass of red wine. Clearly, there are many good reasons for men and women to eat chocolate to obtain its indescribably soothing, mellow, and anxiety-reducing effect.
Chocolate, however, is a newcomer compared to humanity’s first known anxiety-reducing agent: ethyl alcohol. There is evidence that fermentation of grains to make alcohol-containing beverages may have begun in the Caucasus region more than 10,000 years ago. The ancient Egyptians produced alcoholic beverages and there are some passages within their texts referring to the social problems associated with drunkenness. Uisce beatha, meaning “water of life,” was the name given by Irish monks in the 6th century to a drink they prepared, a drink that today we know as whiskey. The drink was a surprisingly excellent source of nutrition, although absent some essential water-soluble B-vitamins.
The first documented distillation of alcohol was the conversion of wine into brandy during the Middle Ages at a medical school in Salerno, Italy. Once again, the new beverage became known as Aqua vitae, Latin for “the water of life.” Brandy remained the primary distilled liquor in Europe until the middle of the 17th century, when the Dutch perfected the process of distilling liquor and flavoring it with juniper berries to make gin.
Alcohol enhances the widespread inhibitory action of the neurotransmitter GABA and acts as a depressant on the central nervous system. For this reason, in the 19th century, alcohol was widely used as a general anesthetic. Unfortunately, the duration of its depressant action on the brain was too long and could not be controlled easily or safely. The effective dose for surgical analgesia using alcohol is very close to its lethal dose. Therefore, it was possible to induce sufficient anesthesia for a cowboy to remove an arrow from his leg, but it was unlikely that the unfortunate cowboy would survive the operation. If the arrow did not kill him, the operation certainly could. Of course, prior to the 20th century, this was generally true for many medicines and therapies.
To conclude, because of your shared evolutionary history with the plants and animals on this planet, when you consume them you risk having their chemicals affect how you feel and even how you think. The degree to which they influence your cognitive functioning depends upon how easily they can achieve an adequate concentration in your brain. As you’ve read, this depends upon how much of each chemical is contained in your diet, your age and physical health, and the status of your blood–gut and blood–brain barriers. With these concepts in mind, consider the following science fiction scenario: a spaceman is walking on an Earth-like planet and is suddenly bitten by an unfriendly fanged creature. The spaceman can see that he is injured and that the beast’s bite injected a liquid substance, perhaps venom, beneath his skin. Does he die?
No, he does not die, because his species and that of the creature on this foreign planet do not share an evolutionary past or a common ancestor. Although they may both be made of proteins formed from amino acids, their independent evolutionary paths make it highly improbable that they use similar neurotransmitter molecules within their respective brains and bodies. Every spaceman from Flash Gordon to Captain Kirk to Luke Skywalker should feel safe walking around any planet (except his own) with impunity from animal and plant toxins. For this same reason, the intoxicating drinks and powerful medicines that always seem to be popular in these foreign worlds in science fiction movies would also have totally different effects, if any effects at all, on the brain of our plucky spaceman. Eating otherworldly foods might be the most disappointing and distressing experience of all: Even if they were filling and somehow tasted delicious, as products of utterly alien biochemistries they would probably prove devoid of nourishment for our Earthly bodies. Thus, starvation might be the greatest threat to any future explorers of alien biospheres. Unless, perhaps, they’d brought along a large supply of chocolate.
Gary Wenk is a professor at Ohio State University, a specialist in the effects of drugs upon the brain, and the author of Your Brain on Food.
Originally published September 13, 2010