How is it decided what type of food we want to eat? For example, sometimes you'd like to eat some sweets but not a full meal; other times you'd like some fruit, and so on.
I have read somewhere that the body can detect whether we are hungry and produces the sensation of hunger accordingly. But the question is: is there something that measures, for example, the level of glucose, amino acids or fats in the blood, and determines not only whether we are hungry but also what type of food we should eat to replenish the lacking resources?
I'm no biologist, but here are two answers to your multidimensional question.
"How is it decided what type of food we want to eat?" There is new research in neuroscience and microbiology on a phenomenon in which the bacteria in our gut (which produce vitamins and digest things we cannot) contribute to the choices we make about our diet. Don't ask me how, but from my understanding they must somehow communicate with the brain to release dopamine while stimulating the neurons responsible for making us crave a certain food.
"Is there something that measures, for example, the level of glucose, amino acids or fats in the blood…?" Well yes, actually; take the pancreas, for example. It detects when blood sugar is high from digested material and begins to produce insulin. I won't go much further, because that alone is a response based on the detection of sugars. Your entire digestive tract relies on these detection processes in order to digest and react appropriately to what you eat. Are there processes that tell your brain what to eat next? Well, besides the bacteria, there are plenty of hormones produced by your gastrointestinal tract, such as ghrelin, the best-known "hunger hormone," which is involved in energy homeostasis, i.e., how energy is regulated and distributed throughout your body.
When researching topics like this, it's often better to ask why rather than how; the how is almost always provided along the way. I'm not entirely sure how those hormones communicate with the brain; my knowledge of the brain is less biochemical and more neurological (e.g., neuron formation).
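The sense-and-respond idea in the answer above can be sketched as a negative-feedback loop. This is a toy illustration only, not a physiological model; every constant here is invented for the sake of the example.

```python
# Toy negative-feedback loop, loosely inspired by glucose/insulin regulation.
# All constants are made up for illustration; this is NOT a physiological model.

def simulate(glucose=180.0, setpoint=90.0, gain=0.1, steps=50):
    """Each step, an 'insulin' response proportional to how far glucose
    sits above the setpoint pulls glucose back toward the setpoint."""
    for _ in range(steps):
        insulin = gain * max(glucose - setpoint, 0.0)  # sensor + response
        glucose -= insulin                             # response lowers glucose
    return glucose

print(round(simulate(), 1))  # settles near the setpoint of 90
```

However crude, the sketch captures the point of the answer: the body doesn't need to "know" anything consciously; a sensor coupled to a proportional response is enough to keep a quantity near a target.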
PSYC 123: The Psychology, Biology and Politics of Food
Professor Brownell gives an overview of the course agenda. The psychological issues of food are introduced, such as who defines food, what promotes health, and how the food industry contributes to both debates. The biological issues that will be discussed include how people’s hard-wired preferences interact with a modern food environment. The political issues of the class will integrate food production, consumption, marketing, and world politics, with discussion of potential interventions for changing food preferences and food intake patterns in society.
About the expert: Linda Bartoshuk, PhD
Linda Bartoshuk, PhD, an APA fellow, is the Bushnell professor of food science and human nutrition at the University of Florida and director for psychophysical research at the university’s Center for Smell and Taste. Bartoshuk studies the sensory perception of food, including taste, olfaction and irritation/pain. She is known for her discovery of supertasters, people who have more taste buds than the average person, her work on enhancing the flavor of tomatoes and for discovering a condition known as burning mouth syndrome. Bartoshuk and her students developed new measurement techniques for quantifying sensations as well as the pleasure/displeasure these sensations evoke. Most recently, Bartoshuk has collaborated with horticulturists to connect sensory variation in fruit with variation in fruit palatability. She is frequently interviewed by national and international media including The Atlantic, BBC News, The Wall Street Journal, NPR and Popular Science.
Untangling what’s behind a craving for certain foods—whether it’s chocolate chips or potato chips—is one thing. But what about the desire to eat things that aren’t foods? The annals of medicine, literature, history, anthropology and psychology are peppered with examples of people who eat items as varied as dirt, clay, paper, plaster, laundry starch, spoons or nails.
The general term for eating nonfood items is pica. According to the National Institutes of Health, pica occurs more often in young children than in adults, appearing to some degree in 10 percent to 30 percent of children ages one to six. It’s also observed more often in pregnant women than in the general population.
A more specific phenomenon is geophagy, the eating of earth, soil or clay. Its practice has been recorded worldwide and is noted as far back as the writings of Hippocrates. In modern times, it has been regarded from various perspectives as “a psychiatric disease, a culturally sanctioned practice or a sequel to poverty and famine,” write the authors of a history of earth-eating that appeared in the Journal of the Royal Society of Medicine in 2002. An analysis by a team of biologists in 2011 suggests that geophagy helps protect the stomach against toxins, parasites and pathogens.
“We hope readers agree that it is time to stop regarding geophagy as a bizarre, nonadaptive gustatory mistake,” wrote the lead investigator of that study, Sera Young of Cornell University.
Indeed, Tufts anthropologist Stephen Bailey points out that in American culture, people consume stomach-soothing products that contain calcium carbonate or bismuth compounds—minerals and metals that would not ordinarily be considered food.
In fact, the very idea of what is considered food differs from culture to culture, says Bailey, an associate professor in the School of Arts and Sciences. In many places where earth or clay are eaten, the substances are gathered, prepared and consumed in a specific, often ritually prescribed manner.—Helene Ragovin
These articles first appeared in the Winter 2014 issue of Tufts Nutrition magazine.
The Gatekeepers Who Get to Decide What Food Is “Disgusting”
At the Disgusting Food Museum, in Sweden, where visitors are served dishes such as fermented shark and stinky tofu, I felt both like a tourist and like one of the exhibits.
In the spring of 2019, Arthur De Meyer, a twenty-nine-year-old Belgian journalist, toured the Disgusting Food Museum, in Malmö, Sweden. As with the Museum of Sex, in New York City, and the Museum of Ice Cream, in San Francisco, the Disgusting Food Museum is conceptually closer to an amusement park than to a museum. There are eighty-five culinary horrors on display—ordinary fare and delicacies from thirty countries—and each tour concludes with a taste test of a dozen items. De Meyer, the son of a cookbook author and a food photographer, told me that he’d always been an adventurous eater. As a reporter, he also prided himself on his ability to maintain his composure. “But the taste test was war,” he said. “The kind where you’re defenseless, because the bombs are going off invisibly, inside of you.”
An Icelandic shark dish, called hákarl, was the first assault on his stomach. “Eating it was like gnawing on three-week-old cheese from the garbage that had also been pissed on by every dog in the neighborhood,” he said. Next up was durian, a spiky, custard-like fruit from Southeast Asia that “smelled like socks at the bottom of a gym locker, drizzled with paint thinner.” But worst of all was surströmming, a fermented herring that is beloved in northern Sweden. De Meyer said that eating it was like taking a bite out of a corpse.
He vomited ten times, topping the museum’s previous record of six. Mercifully, admission tickets are printed on airplane-style barf bags.
The Disgusting Food Museum, which opened in 2018, is the brainchild of Samuel West, a forty-seven-year-old psychologist who was born in California and has lived in Sweden for more than two decades. In 2016, during a trip to Zagreb, Croatia, he wandered into the Museum of Broken Relationships. As he studied the remnants of strangers’ failed romances—photos of hookup spots, a diet book that a woman received from her fiancé—West came up with an idea for a museum dedicated to failed business products and services. A year later, in Helsingborg, Sweden, he opened the Museum of Failure, where the takeaway was simple: blunders are the midwives of success. One example on display at the museum was the Newton, a personal digital assistant released by Apple in 1993. Its shoddy handwriting software and exorbitant price nearly torpedoed the entire company, but its sleek black design eventually inspired the iPhone. The exhibits also included Bic for Her, a line of pens from 2011 that were designed for women; DivX, a 2003 trademark for “self-destructing” DVDs that could be watched for only forty-eight hours; a collection of Harley-Davidson perfumes from the mid-nineties; and Trump: The Game, a Monopoly ripoff released in 1989. (The game was pulled from shelves after Trump said that it was “too complicated.”)
The Museum of Failure was a resounding commercial success, attracting visitors from across the world and attention from the Times, the Washington Post, and National Geographic. By 2018, though, West was on to his next project, after reading an article about how reducing beef consumption could slow climate change. The piece explained that a dire problem could be eased by a simple solution—eating insects, a good source of protein—but that the First World had rejected this idea out of disgust. West realized that if the experience of failure had expedited human innovation, then the experience of disgust was potentially holding us back. Could that aversion be challenged or changed? “I just wanted to know, Why is it that even talking about eating certain things makes my skin crawl?” he told me, animatedly, over Zoom.
The planning for the museum began with a more basic question: What counts as food? West recruited his friend Andreas Ahrens, a former I.T. entrepreneur and a foodie, to help him choose which items would qualify for exhibition. The men ruled out artificially flavored gag gifts—such as Rocket Fizz’s barf soda and Jelly Belly’s booger jelly beans—and novelty foods like deep-fried Oreos and a Polish beer that had been brewed with a woman’s vaginal yeast. Four hundred items made it through the initial screening, after which they were culled based on four criteria: taste, texture, smell, and the process by which they were made. Foie gras “failed” the taste, texture, and smell tests, which is to say that West and Ahrens found it inoffensive on those fronts. But the dish, which is typically produced by force-feeding ducks until their livers swell to ten times their normal size, easily passed the process test, earning itself a place at the museum. (According to Ahrens, many visitors, after reading about the process, swear to never eat foie gras again.) The winnowing of the foods was spirited and combative. West emerged as the bigger wimp: he threw up so many times that he lost count. Ahrens found plenty of the foods unpleasant, but he got sick only after tasting balut, a Filipino egg-fetus snack that is eaten straight from the shell—feathers, beak, blood, and all.
After the men chose the items, they had to contend with customs and transportation. Svið, a traditional Icelandic dish in which a sheep’s head is cut in half and boiled, was impossible to procure, for “logistical reasons,” Ahrens said. The food is instead represented by a photo of the head next to helpings of mashed potatoes and pureed root vegetables. The same goes for ortolan, a nearly extinct French songbird, which is prepared by blinding the bird and then drowning it in brandy, a practice that is now banned in the European Union. Raw monkey brain, which was supposedly served at Chinese imperial banquets, is represented by a type of wooden table that would have been used to hold down a live monkey while the top of its head was sliced open and spooned out. (“It is unclear whether it’s an urban legend, or something that’s still being served in China,” an accompanying sign says.)
Even the foods that appear at the museum in their real forms posed unusual difficulties. To make cuy, a Peruvian dish, West had to watch several YouTube videos on how to skin and boil a guinea pig. “I sent my wife and children away the day I did it,” he recalled. “It just felt wrong, bordering on criminal.” For a South Korean wine that demanded the “fresh turds” of children, Ahrens found himself scooping up his eight-year-old daughter’s excrement and fermenting it with rice wine. The final product is on display at the museum, in a gallon jug, though Ahrens has not mustered the will to try it.
On Tripadvisor, the Disgusting Food Museum is ranked No. 1 on a list of ninety-four things to do in Malmö, the third-largest city in Sweden. Visitors are often surprised to find that the museum is situated on the first floor of a shopping mall, between a furniture store and an art gallery. Daniela Nusfelean, a Romanian college student who visited the museum in January, said that one of the first things she noticed was the absence of any odor. “This place is supposed to have so much food,” Nusfelean remembered thinking. “How can food not smell?”
The stinkier items are secured under bell jars, Ahrens, the museum’s director, said, when he gave me a tour over Zoom, earlier this year. Most foods, such as kale pache—an Iranian soup made from a sheep’s head and hooves, which are boiled overnight to eliminate any smells—were displayed in bowls or pots that sat atop a series of white tables, illuminated by long-necked lamps. (Some of the foods are made fresh every week; others, like the poop wine, have a lengthy shelf life.) The museum, whose walls were bright and bare, looked as sterile as a science lab, until Ahrens, who wore a T-shirt that bore the museum’s logo and the word “Yuck!,” gestured to a chalkboard that read “2 days since last vomit.” “This is the scoreboard,” he said, grinning.
We went on to the exhibits, each of which was accompanied by a placard that, in English and Swedish, noted a dish’s history and its country of origin. First stop: dried stinkbugs from Zimbabwe, which vaguely resembled the buds of microgreen sprouts. Then there was kungu cake (East Africa), a dessert made from millions of crushed flies; fried locusts (Israel), the only insect that the Torah considers kosher; frog juice (Peru), a frothy green beverage containing frogs and quail eggs; and mouse wine (China), a jug of rice wine infused with two hundred baby rodents.
Eventually, Ahrens led me to a Warhol-esque wall of yellow and red cans. “Our most popular selfie destination,” he said, adding that the cans, which were full of surströmming, the fermented herring, had induced more vomiting than any other item in the museum. (“Surströmming is one of the worst smelling foods in the world,” a placard read.) The exhibit featured a smell jar, inviting visitors to lift the lid and to take a sniff. Before the pandemic, one of the highlights of the museum was a photo booth that sprayed jet streams of various scents—durian, stinky tofu (a fermented bean-curd dish)—and captured visitors’ facial expressions as they inhaled. “Instagram,” Ahrens explained.
The term “disgust” entered the English language more than four hundred years ago, from the Old French word desgouster, meaning “to put off one’s appetite.” But disgust wasn’t considered worthy of scientific examination until 1872, when Charles Darwin defined it as a reaction to “something revolting, primarily in relation to the sense of taste . . . and secondarily to anything which causes a similar feeling, through the sense of smell, touch and even of eyesight.” Darwin theorized that disgust is a basic human emotion—like anger, fear, or sadness—and that it is expressed with a universal “disgust face.” If you are presented with a glass of sour milk, you will almost certainly scrunch up your nose, purse your lips, and blow out air between them, making an “ack” or “ugh” sound through clenched teeth. If you are forced to drink the milk, you might open your mouth wide, tense your brows, and retract your upper lip to decrease inhalation, pinching your features into the likeness of the vomit-face emoji (all of which is often a precursor to the act itself).
There is a reason that we find certain foods offensive. A prehistoric human who scarfed down decomposing meat or bacteria-ridden feces wouldn’t have lived long. “Life would have been simpler if we were koala bears,” Daniel Fessler, an evolutionary anthropologist at U.C.L.A., told me. Koala bears eat only eucalyptus leaves, so there isn’t a lot of hand-wringing about what’s for dinner. But humans have made it a lot further in life than koalas, in large part because of our diet. Eating meat has allowed our digestive tracts to shrink and our brains to grow in outsized proportion to our bodies, because the animals we consume have already extracted the nutrients we need. Meat consumption, however, has also entangled our species in the omnivore’s dilemma: we must be flexible enough to consume a variegated diet, yet wary enough of novelty to avoid accidental death.
Evolutionary psychologists often cite the Swiss Army knife as an analogy for the mind, because both have all-purpose tools designed to cope with an unpredictable world. Disgust is simply one blade of many. If the blade is kept sharp, it helps you avoid disease, but if it becomes too sharp you might not ingest enough calories. “Evolution has optimized this trade-off so that priority is placed on the more urgent goal,” Fessler said. If you’re starving, then the blade is dulled: you may be more likely to eat something that you’d otherwise find disgusting, such as rotting leftovers. (As Cervantes wrote in “Don Quixote,” “Hunger is the best sauce.”) “The key point here is that people do not need to make conscious decisions about these trade-offs,” Fessler said. Evolved psychological mechanisms do the work.
Disgust may have originated as a food-rejection system, Paul Rozin, a psychology professor at the University of Pennsylvania, told me, “but it has expanded into a vehicle for perceiving the social and moral world.” Rozin is the pioneer of a subfield called disgust studies. His favorite experiment involves dropping a cockroach into a glass of juice. Most people, of course, refuse to drink the juice, citing the dirtiness of cockroaches. “What’s amazing is that even if you disinfect the cockroach and convincingly demonstrate that the juice is harmless, people still won’t want to drink it,” Rozin said. The juice has been irrevocably contaminated.
The concept of contamination is one example of how biology maps onto cultural systems. Both Islam and Judaism forbid the consumption of pork; many cultures avoid other kinds of meat. These taboos may have been provoked by disgust (pigs are thought to be unclean, raw meat tends to be slimy and unappetizing, and both can cause disease if prepared incorrectly), but disgust can also be perpetuated by taboos. Lebanese Christians are technically allowed to eat pork, but many of them abstain, owing to the influence of their pork-avoidant neighbors in the Muslim-majority country.
Like a regional dialect or a style of dress, most food taboos advertise and affirm membership within a group. Humans evolved in tribes, and food taboos helped to define coalitions. In a Hobbesian past, a cohesive tribe would have had a better chance of domination. Chimps know this just as well as high-school cliques do. A show of strength intimidates the loners—by making them feel like losers. It’s not an accident that minorities with unfamiliar customs can pique our suspicion, Mark Schaller, a social psychologist at the University of British Columbia, told me. Our behavioral immune system, much like our biological immune system, is meant to detect danger. But it can go into overdrive. Schaller compared it to a smoke detector. “It’s designed to be hypersensitive for a reason,” he said. “In the wild, it’s O.K. to make small errors by overestimating a threat, but, if you underestimate, you are dead.”
When I was a child in Chongqing, in the nineteen-eighties, food forged the rules and the language of existence. To be fed was to be loved, and to live was to taste the world. (In Chinese, the character for “life” contains the component word “tongue.”) I grew up on an Army compound—my mother was in the military—and the adults I knew had a habit of pinching the round bums of young children, appraising them as “great juicy cuts of meat for dumplings.” Many of those adults, my father included, had lived through the worst famine in history, during which some villagers had cannibalized one another. When I wondered, at the age of four, if human flesh tasted like pork, it did not occur to me that the thought might be disgusting.
As a young Army recruit, my mother ate the rats that scurried outside the granary she guarded, and for years she ate kernels of rice that she found on the ground—something I was told by other adults never to do. To be the first member of my family spared the pangs of hunger was to live through an epochal transition that felt like cultural transformation. Still, the threat of deprivation hung over our lives like the dangling carcasses in the village wet markets.
At those markets, my mother traded her extra grain coupons—which she began to receive after becoming an Army doctor—for eggs, an expensive protein in the hierarchy of foods. Shortly before I began first grade, my mother stopped feeding me the rice porridge and the pickles that she and my grandmother ate every morning and started me on a special breakfast of what she called “brain foods”: a warm, viscous puddle of milk, bobbing with chunks of raw egg yolk. My Swiss Army knife was already being honed. Disgust welled up in me, but it contended with other blades that were necessary for survival: the shame of ingratitude, and the fear of disobedience. I ate the brain foods every morning for two interminable years.
Even so, disgust did not leave a lasting mark on my psyche until 1992, when, at the age of eight, on a flight to America with my mother, I was served the first non-Chinese meal of my life. In a tinfoil-covered tray was what looked like a pile of dumplings, except that they were square. I picked one up and took a bite, expecting it to be filled with meat, and discovered a gooey, creamy substance inside. Surely this was a dessert. Why else would the squares be swimming in a thick white sauce? I was grossed out, but ate the whole meal, because I had never been permitted to do otherwise. For weeks afterward, the taste festered in my thoughts, goading my gag reflex. Years later, I learned that those curious squares were called cheese ravioli.
Olives were another mystery. In Chongqing, I had been introduced to them as a fig-like snack, dried or cured, that had a sweet-tart kick. In the U.S., I placed a dark-green drop onto my tongue and, for the first time in my life, spat something out of my mouth and into my palm. Salty and greasy weren’t what I was expecting, and my reaction was born as much of disgust as it was of having been deceived.
To be a new immigrant is to be trapped in a disgusting-food museum, confused by the unfamiliar and unsettled by the familiar-looking. The firm, crumbly white blocks that you mistake for tofu are called feta. The vanilla icing that tastes spoiled is served on top of potatoes and is called sour cream. At a certain point, the trickery of food starts to become mundane. Disgusting foods become regulars in the cafeteria, and at the dinner table.
Recently, I joined a few Asian-American friends at a restaurant in Queens to have hot pot, a fondue-like communal meal in which ingredients are dipped in a shared pot of boiling broth at the center of the table. By the time I arrived, bowls of sliced pig arteries, pig intestines, cow stomach, duck feet, and pale-pink brains of unidentified provenance already sat around a burbling vat of broth, spices, and chili oil. All of these would have made it into a Westerner’s encyclopedia of disgusting foods, but everyone at the table knew that the gusto with which we consumed the entrails and viscera connected us.
I asked my companions if they’d had any memorable encounters with disgusting food. Nearly all of them named dairy products that they had tried for the first time in the United States. A Chengdu native recalled the chalky taste of a protein shake, making the classic disgust face as she spoke. “The first time I had pizza was bad,” Alex, a forty-year-old network engineer, said. It was margherita pizza, and he thought that the little white splotches of melted burrata were fresh vomit. “I couldn’t believe that there were people who ate this regularly,” he continued. “But Americans told me this was a very common food here.” He bit into the muscled leg of a bullfrog.
“And I just learned to get used to it.”
I had had almost the exact same experience with a Sicilian slice some three decades before. Assimilating requires you to adopt a foreign tongue, in more ways than one. But when the choice is between annihilation and assimilation, you assimilate. This was as true for prehistoric humans as it is for a young, deracinated Chinese immigrant in America. One of the wonders of the tongue is its sheer malleability. New tastes are acquired and seamlessly incorporated into the tapestry of one’s gastronomic predilections. I don’t remember the exact moment when I began relishing Western olives, but the change felt natural: with each new experience, the tapestry is rewoven.
Shortly before my virtual tour of the Disgusting Food Museum, I had received a temperature-controlled package in the mail. It contained goat-stomach cheese, fermented shark, surströmming, and several other items from the museum’s taste test. I arranged the food in small saucers around my laptop and launched Zoom, where Andreas Ahrens was waiting for me. Before I dug in, he suggested I check that the items had made it through their transatlantic journey O.K. “Maybe smell them just to make sure they haven’t gone bad,” he said. But, wait, I said, weren’t most of them supposed to smell bad? He laughed. “Good luck, then.”
I opened a pouch of German sauerkraut juice. Its putrid gray color reminded me of stagnant gutter water. By way of encouragement, Ahrens said, “Very few people try nothing. Most try more than they thought they would.” I had skipped lunch to prepare for the taste test, and by then my stomach was growling so loudly that I felt obliged to apologize to the screen.
The juice tasted cool and refreshing—a blend of pickles and kimchi. Next was bagoong, a Filipino fermented shrimp, which tasted so much like a beloved Chinese fish sauce that I was tempted to spoon it over some leftover rice. Things started getting real with hákarl, the Icelandic shark. My head cocked back at the taste of ammonia, but the chewy texture reminded me pleasantly of squid. I moved on to the insects, beginning with grasshoppers from Oaxaca, Mexico, which had been marinated with dried chilies. They were delicious—crispy, sour, and spicy, like lime-tossed tortilla chips. A bag of dehydrated mixed bugs contained mole crickets and sago worms. The hardest part was knowing that you were eating something that you last saw crawling on the bathroom floor. Crunchiness, I discovered, was a crucial factor in palatability; the crickets could have passed for salty granola. The worms, which looked like deformed prunes, were denser and nuttier. Everything tasted considerably better than it looked.
Your Happy Diet Cheat Sheet
Wondering what to nosh on next? Use this checklist of what to eat and to avoid to keep your brain balanced and firing on all cylinders.
Fill Up On
- Oily fish rich in omega-3 fats
- High-antioxidant veggies like dark, leafy greens
- Dark, colorful berries
- Chewy whole grains like brown rice, quinoa, and whole-wheat pasta
Stay Away From
- Fried foods containing saturated and trans fats
- Processed simple carbs like white-flour breads and crackers
- Sweets and candy
- Artificial sweeteners, which some research suggests may negatively affect gut bacteria
Sunny Sea Gold is a health journalist and author of the 2011 book Food: The Good Girl’s Drug.
What Science Says about Snacking
American dining may have evolved from Old World custom into the "three square meals" tradition of the 20th century, but today's consumers are snackers. In fact, over the last four decades, more Americans have traded in meals for snacks.
Between-meal noshing supplies nearly one-quarter of daily calories, earning snacks the status of "fourth meal." What's more, since the late 1970s, daily calorie intake has increased among men and women, with the majority of additional calories consumed between meals. A 2011 report from the U.S. Department of Agriculture claims Americans snack twice as often as they did in the late 1970s, although newer analysis of the data suggests the frequency of snacks has stayed the same while total calories have increased.
These figures have led some experts to ask how snacking affects body weight and other health concerns.
How and Why We Snack
While some people eat between meals because they hold a vague notion that frequent eating is healthful, others report snacking to satisfy cravings for sweet or salty foods, prevent or relieve hunger, boost nutrient intakes, control weight, rev their metabolic rates, pass the time, deal with unsettling emotions or replace meals.
According to a 2014 Nielsen report, 41 percent of North American respondents ate snacks instead of dinner at least once in the previous 30 days. The favorite snacks in North America are chips, chocolate and cheese, according to the report.
Fresh fruit landed fifth in popularity, with 55 percent of survey respondents reporting they ate fresh fruit for a snack at least once in the previous 30 days. A separate study reported that adolescents who snacked most often were the most likely to skip meals. All-day grazing and frequent snacking instead of structured meals and snacks may be side effects of today's on-the-go lifestyle.
Does Snacking Affect Weight?
Snacking may help control appetite, or it may contribute to recreational eating and excess calories. Research supports both opposing views. Beginning in the 1960s, studies noted that people who ate the fewest times during the day had the greatest amount of excess body weight, leading many health professionals to recommend frequent eating as a weight-loss tool.
More recently, researchers have challenged the idea that eating frequently aids weight control. A widely recognized problem in diet studies is underreporting of food and calorie intake by some participants. When researchers removed data of people they suspected gave faulty information, the results suggested that the more often someone ate, the higher his or her body mass index would be. Spanish researchers found that people who identified themselves as usual snackers were most likely to gain significant weight during the study's 4½-year follow-up period. Plus, they were nearly 70 percent more likely to become obese.
Among teen girls, eating frequently at the beginning of the study predicted less body fat a decade later. And a study of nearly 2,700 men and women in their 40s and 50s found those who consumed solid food six or more times in 24 hours took in fewer calories and had a lower mean BMI compared to participants who ate solid foods fewer than four times daily.
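For readers unfamiliar with the measure cited above, BMI is defined as weight in kilograms divided by the square of height in meters. A minimal sketch of that arithmetic, with invented example values rather than data from any of the studies discussed:

```python
# BMI = weight (kg) / height (m)^2, the standard definition.
# The people below are invented examples, not study participants.

def bmi(weight_kg, height_m):
    return weight_kg / height_m ** 2

group = [(70.0, 1.75), (82.0, 1.80), (65.0, 1.60)]  # (weight, height) pairs
mean_bmi = sum(bmi(w, h) for w, h in group) / len(group)
print(round(mean_bmi, 1))
```

A "lower mean BMI," as in the study of adults in their 40s and 50s, is simply this average taken over each comparison group.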
Conflicting data may be the result of many factors, such as the way researchers defined a snack or eating occasion, whether or not caloric beverages were included in the analyses and underreporting of food, beverage and calorie intake, which can make dietary assessment tools invalid. Reverse causality also may be at play, meaning that some people with higher BMIs may choose to eat less frequently in an attempt to lose weight, rather than being heavier because they eat less often.
Though population studies show inconsistent results, randomized intervention trials allowing subjects to choose what they eat generally show no effect on body weight. Of five short-term studies comparing high and low eating frequencies, only one showed a slight advantage when subjects consumed more meals and snacks. Sixteen adults with high cholesterol levels consumed the foods they typically ate, but either as three or nine meals daily for four weeks. Participants eating more often lost an average of 0.9 pounds, while those on the less-frequent meal pattern dropped only 0.2 pounds. In a two-month weight-loss program combining meal replacements and regular food, weight loss was the same whether participants consumed three daily meals or three meals plus a bedtime snack.
Although some dieters snack to boost their metabolic rates, research suggests these efforts are in vain. Studies that examine data for up to 48 hours after eating find that the jump in metabolic rate or the thermic effect of food is not dependent on meal frequency. Rather, overall metabolic rate is similar when a specific amount of food is eaten during few or many occasions. Since frequent eating doesn't appear to burn more calories, researchers looked at the opposite side of the energy balance equation: Does frequent eating cause people to consume fewer calories? One review found a slight benefit to appetite control when eating six meals per day compared to three, and that eating fewer than three meals per day is unfavorable for appetite control.
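The point about meal frequency and metabolic rate can be illustrated with a toy calculation. As a rough simplification (the 10 percent figure is an assumption for illustration, not a value from the studies above), suppose the thermic effect of food is about 10 percent of calories consumed. Splitting the same daily intake across three or nine eating occasions then yields the same total:

```python
# Toy illustration: the thermic effect of food (TEF) scales with total
# calories eaten, not with how many meals those calories are split across.
# The 10% TEF rate is a common rough approximation, assumed here only
# for the sake of the example.

TEF_RATE = 0.10          # assumed fraction of intake spent on digestion
DAILY_CALORIES = 2000

def total_tef(daily_calories, meals):
    """Sum the thermic effect over each meal; the per-meal split cancels out."""
    per_meal = daily_calories / meals
    return sum(per_meal * TEF_RATE for _ in range(meals))

print(total_tef(DAILY_CALORIES, 3))  # three larger meals
print(total_tef(DAILY_CALORIES, 9))  # nine small meals: same total
```

However the intake is divided, the per-meal amounts sum back to the same total, which is why studies find overall metabolic rate is similar across eating frequencies.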
Both the Evidence Analysis Library of the Academy of Nutrition and Dietetics and experts at a 2009 symposium on eating frequency and energy balance concluded that scientific evidence pointing to an ideal eating frequency for weight control doesn't exist at this time.
Snacking and Other Metabolic Effects
Eating frequency has the potential to affect metabolic parameters other than weight and body fat. In the two-month meal replacement study previously mentioned, there were no differences in cholesterol or triglyceride levels between those eating either three or four times daily.
However, when seven healthy men consumed identical diets as either three daily meals or 17 daily "nibbles" (defined as smaller than a regular snack) for two weeks, cholesterol measurements were better with the nibbling pattern. This study has limitations due to its small sample size, so more research is needed to support the findings.
Additionally, two single-day studies found improvements in blood sugar and lipids when adults with Type 2 diabetes ate more often. But a four-week study among people with Type 2 diabetes found no such advantage when comparing nine small meals to three larger meals and one snack.
Even if long-term benefits were likely, would many people want to eat up to 17 times per day?
Snacking and Diet Quality
Snacks may boost diet quality or lead to excess intakes of solid fats, added sugars and sodium. Although experts debate the health value of snacking, nearly all agree that the type of snack matters. A study of 233 adults in a worksite wellness program found that total snacking calories and frequency of snacking were unrelated to diet quality or BMI. However, the choice of snack foods affected both. The percentage of snacking calories from nuts, fruit and 100-percent fruit juice was related to better diet quality, while percentage of snacking calories from sweets and sugar-sweetened beverages was related to poor diet quality. Eating vegetables as snacks was associated with lower BMI, and eating sweets was associated with higher BMI.
While there is considerable interest in eating frequency, there is no consensus regarding an ideal pattern. It may be that meal and snack quality is more important than frequency of eating and that consumers can benefit from any number of meal patterns. As research into these factors continues, the best pattern may be the one most suitable to a person's individual lifestyle.
Fruits and Vegetables
Fruits and vegetables are two separate groups. The fruit group contains any fresh, canned, frozen or dried fruit, as well as 100 percent fruit juice. The Guidelines recommend you consume 2 cups of fruit if you take in 2,000 calories daily. The vegetable group is divided into those that are dark green; red or orange; starchy; beans and peas, also called legumes; and other vegetables. You should consume 2 1/2 cups of vegetables each day, dividing your intake among the five sub-groups over the week. To get the best mix of nutrients, the Harvard School of Public Health recommends choosing a variety of kinds and colors of vegetables and fruits, especially those that are dark green and leafy, or bright yellow, orange or red.
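A minimal sketch of checking a day's intake against the amounts quoted above for a 2,000-calorie diet (2 cups of fruit, 2 1/2 cups of vegetables). The portions in the example are made-up data, and real guideline amounts vary with calorie level:

```python
# Sketch: compare a day's fruit and vegetable intake against the
# guideline amounts for a 2,000-calorie diet. Targets come from the
# text above; the sample intake values are invented for illustration.

FRUIT_TARGET_CUPS = 2.0
VEGETABLE_TARGET_CUPS = 2.5

def meets_targets(fruit_cups, vegetable_cups):
    """Return which of the two daily targets this intake satisfies."""
    return {
        "fruit_ok": fruit_cups >= FRUIT_TARGET_CUPS,
        "vegetables_ok": vegetable_cups >= VEGETABLE_TARGET_CUPS,
    }

day = meets_targets(fruit_cups=1.5, vegetable_cups=3.0)
print(day)  # fruit falls short of target, vegetables exceed it
```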
Micronutrients: Water-Soluble Vitamins, Fat-Soluble Vitamins and Minerals
Micronutrients are vitamins and minerals that the body also needs to obtain through diet, but in much smaller quantities than macronutrients. In addition to minerals, the body needs two types of vitamins: water-soluble and fat-soluble.
Water-soluble vitamins dissolve in water and include the eight B vitamins, as well as vitamin C. Because these vitamins are not stored in the body, they are easily flushed out, so it is important to get enough vitamin B and C through a daily, healthy diet. The B vitamins largely help drive the chemical reactions that produce energy, while vitamin C supports the immune system. Foods like whole grains, meat, fish, eggs, avocado, carrots, citrus, spinach, bell peppers, almonds and sweet potatoes are good sources for all vitamin B and C needs.
Fat-soluble vitamins do not break down in water and are mostly absorbed when eaten with a source of healthy fat. This describes vitamins A, E, D and K, which contribute to organ function, a strong immune system, blood clotting and strong bone development. Leafy greens, pumpkin, almonds, spinach, dairy, fish and even exposure to sunlight can provide the right quantities of these vitamins.
Calcium, sodium, magnesium and potassium are among the most important minerals the body needs, though others are also necessary. These minerals aid muscle function, strengthen bones and help the body to maintain blood pressure and fluid balance. Dairy, broccoli, yogurt, fish, turkey, lentils, bananas, garlic and onions all provide the minerals the body needs.
What Your Genes Want You to Eat
A trip to the diet doc, circa 2013. You prick your finger, draw a little blood and send it, along with a $100 fee, to a consumer genomics lab in California. There, it's passed through a mass spectrometer, where its proteins are analyzed. It is cross-referenced with your DNA profile. A few days later, you get an e-mail message with your recommended diet for the next four weeks. It doesn't look too bad: lots of salmon, spinach, selenium supplements, bread with olive oil. Unsure of just how lucky you ought to feel, you call up a few friends to see what their diets look like. There are plenty of quirks. A Greek co-worker is getting clams, crab, liver and tofu -- a bounty of B vitamins to raise her coenzyme levels. A friend in Chicago, a second-generation Zambian, has been prescribed popcorn, kale, peaches in their own juice and club soda. (This looks a lot like the hypertension-reducing ''DASH'' diet, which doesn't work for everyone but apparently works for him.) He is allowed some chicken, prepared in a saltless marinade, hold the open flame -- and he gets extra vitamin D because there's not enough sunshine for him at his latitude. (His brother's diet, interestingly enough, is a fair bit different.) Your boss, who seems to have won some sort of genetic lottery, gets to eat plenty of peanut butter, red meat and boutique cheeses.
Nobody is eating exactly what you are. Your diet is uniquely tailored. It is determined by the specific demands of your genetic signature, and it perfectly balances your micronutrient and macronutrient needs. Sick days have become a foggy memory. (Foggy memory itself is now treated with extracts of ginkgo biloba and a cocktail of omega-3 fatty acids.)
''Ultimately, the feedback you'll get will be continuous,'' says Wasyl Malyj, an ''informatics'' scientist at the University of California at Davis working with the new Center of Excellence for Nutritional Genomics, who is helping me blue-sky here. The appeal of this kind of laser-targeted diet intervention is hard to miss. If you turn out to be among the population whose cholesterol count doesn't react much to diet, you'll be able to go ahead and eat those bacon sandwiches. You'll no longer be spending money on vitamin supplements that aren't doing anything for you; you'll take only the vitamins you need, in precisely the right doses. And there's a real chance of extending your life -- by postponing the onset of diseases to which you're naturally susceptible -- without having to buy even a single book by Deepak Chopra.
This, then, is the promise -- and the hype -- of nutritional genomics, the second wave of personalized medicine to come rolling out of the Human Genome Project (after pharmacogenomics, or designer drugs). The premise is simple: diet is a big factor in chronic disease, responsible, some say, for a third of most types of cancer. Dietary chemicals change the expression of one's genes and even the genome itself. And -- here's the key -- the influence of diet on health depends on an individual's genetic makeup.
How does that work? Consider what happens, biologically, when we eat a meal. Until quite recently, most scientists thought food had basically one job: it was metabolized to provide energy for the cell. Indeed, that is what happens to most dietary chemicals -- but not all of them. Some of them don't get metabolized at all; instead, the moment they're ingested, they peel off and become ligands, molecules that bind to proteins involved in ''turning on'' certain genes to one degree or another. A diet that's particularly out of balance, nutritional-genomics scientists say, will cause gene expressions that nudge us toward chronic illness -- unless a precisely tailored ''intelligent diet'' is employed to restore the equilibrium.
Take genistein, a chemical in soy, which attaches to estrogen receptors and starts regulating genes. Different individuals may have estrogen receptors that react to genistein differently. Genetic variations like that one, some scientists say, help explain why two people can eat exactly the same diet and respond very differently to it -- one maintaining his weight, for example, and the other ballooning.
There is a buzz around nutritional genomics at the moment, which is partly a matter of timing. A sea change is under way in the approach scientists are taking to disease -- they're looking less to nature or nurture alone for answers, and more to the interactive symphony of ''systems biology'' that nutrigenomics epitomizes.
At the same time, chatter around this new science has been amplified by a controversy. The idea of the biological relevance of race -- even its very existence -- is hotly debated. And the assumption of real genetic markers that distinguish one ethnic group from another is at the philosophical heart of nutrigenomics.
Here's the most familiar example: If you're of Northern European ancestry, you can probably digest milk, and if you're Southeast Asian, you probably can't. In most mammals, the gene for lactose tolerance switches off once an animal matures beyond the weaning years. Humans shared that fate as well -- until a mutation in the DNA of an isolated population of Northern Europeans around 10,000 years ago introduced an adaptive tolerance for nutrient-rich milk. The likelihood that you tolerate milk depends on the degree to which you have Northern European blood.
''That, essentially, is the model -- a very dramatic one,'' says Jim Kaput, the founder of NutraGenomics, a biotechnology company. ''As humans evolved, and as our bodies interacted with foods on each of the continents, we sort of self-selected for these naturally occurring variants. And certain populations have variants that, when presented with Western-type food -- which is usually fatty and overprocessed and high in calories -- pushes them toward disease rather than health.''
Plenty of examples bear out this ill fit between certain cultures and certain diets -- suggesting, if not quite proving, some interplay of genes and nutrition: the Japanese who relocated to the United States after World War II soon saw their cholesterol levels soar. The Alaskan Inuit, whose metabolism was perfectly suited to moving around all day, looking for high-fat food, were suddenly saddled with an evolutionary disadvantage when they began living in heated homes and traveling on snowmobiles, and they now show high levels of obesity, diabetes and cardiovascular disease. The Masai of East Africa have developed new health problems since they abandoned their traditional cattle-meat-and-blood-and-milk diet for corn and beans.
The cradle of nutrigenomics is the cradle of humankind itself: the original migration out of Africa created widely separated subpopulations with distinct collections of gene variants. The members of each subpopulation tend to respond similarly to diet and environmental conditions. But the genetics of race is an inexact science. And since many people have ancestors from different continents -- making them a genetic admixture -- the data are rarely clean-cut. In other words, ethnicity is relevant to nutritional genomics, but only as a starting point. Which is why the idea of sorting ourselves by race and pursuing a diet consistent with the original continental diet isn't going to be very helpful. And why, in fact, the customized diets of most people's perfect genomic future will probably not be all that different from one another.
Kaput estimates that the middle 60 percent of the bell curve are probably not going to need to deviate too much from the basic fruit-and-vegetable-heavy diet recommended by the Department of Agriculture. The folks who will benefit from customized nutritional packets, he says, will be the 20 percent at either end: those at the top who don't have to worry much about what they eat -- and will thus be able to cut corners -- and the 20 percent on the bottom, who respond disastrously to conventional diets and will discover that they need to follow special diets or eat specific supplements. The problem for everyone will be figuring out where they fall on the curve of each disease profile.
Just how far in the future are we projecting here? When will nutrigenomics be ready for public consumption? Even many of those who have faith in the science concede that the staggering complexity of interactions among genes, and between genes and the environment, will be a real challenge to solve. As a workable concept, ''eat right for your genotype'' may be a decade or two -- or more -- down the road.
''Right now, no one in their right mind would offer genetic testing or tell you what drug to take,'' says Dr. Muin Khoury, director of the Office of Genomics and Disease Prevention at the Centers for Disease Control. Despite that warning, a handful of companies are already offering genomics profiles and nutritional supplements to early adopters looking for an edge. One company, the North Carolina-based Great Smokies Diagnostic Laboratory, offers a genetics-testing service called Genovations. Clients pay up to $1,500 for a preventive health profile.
For nutrigenomics to realize its potential, though, vast, ethnically diverse databases of genomic profiles will have to be assembled, from which researchers will try to divine patterns.
But that, of course, opens up a whole new can of genetically modified worms. Once our genotypes are in databanks, can we really be sure they won't be sold to employers or insurance companies? And in what social gulag will those poor saps find themselves who simply cannot resist tucking into a double-cheese all-beef sub during the seventh-inning stretch?