I've long been interested in circadian rhythms and recently came across a variety of articles indicating that the following neurotransmitters, transferases and hormones are connected. There are mentions of circadian rhythmicity and of homeostatic maintenance for these compounds (see: Are serotonin levels in humans affected by light?):
- HIOMT (hydroxyindole O-methyltransferase), NAT (arylalkylamine N-acetyltransferase)
Additionally, their expression is related to the suprachiasmatic nucleus (SCN) and to light exposure.
I'm interested in a good article or talk that discusses the homeostatic system these compounds are part of. I'm particularly interested in their effect on the feeling of well-being.
It seems to me that all these compounds are in some way related to movement and feeding. However, I cannot piece the picture together from multiple separate articles that each describe part of it, and am looking for a single paper that may shed light on the connection between these compounds.
Serotonin and the regulation of mammalian energy balance
Maintenance of energy balance requires regulation of the amount and timing of food intake. Decades of experiments utilizing pharmacological and later genetic manipulations have demonstrated the importance of serotonin signaling in this regulation. Much progress has been made in recent years in understanding how central nervous system (CNS) serotonin systems acting through a diverse array of serotonin receptors impact feeding behavior and metabolism. Particular attention has been paid to mechanisms through which serotonin impacts energy balance pathways within the hypothalamus. How upstream factors relevant to energy balance regulate the release of hypothalamic serotonin is less clear, but work addressing this issue is underway. Generally, investigation into the central serotonergic regulation of energy balance has had a predominantly “hypothalamocentric” focus, yet non-hypothalamic structures that have been implicated in energy balance regulation also receive serotonergic innervation and express multiple subtypes of serotonin receptors. Moreover, there is a growing appreciation of the diverse mechanisms through which peripheral serotonin impacts energy balance regulation. Clearly, the serotonergic regulation of energy balance is a field characterized by both rapid advances and by an extensive and diverse set of central and peripheral mechanisms yet to be delineated.
Feeding is a behavior that ensures an adequate and varied supply of nutritional substrates essential to maintain energy levels for basal metabolism, physical activity, growth, and reproduction and hence, for survival of every living organism on Earth. In the case of mammals, which must maintain a stable body temperature, a high metabolic rate requires constant availability of a sufficient amount of energy stores. The tight balance between energy demand and expenditure is fine-tuned by an adapted dialog between homeostatic and hedonic brain systems that are regulated by peripheral signals involved in feeding behavior and energy homeostasis. Mechanisms for feeding control remain a current and crucial scientific subject for understanding the etiology and potential therapeutic approaches for the treatment of food intake disorders that include obesity, on one hand, and severe forms of anorexia nervosa (AN) on the other.
Voluntary anorexia is a disease not unique to man and has even been described in many vertebrate species that favor migration activity (Wang et al., 2006). In this case, surviving food deprivation involves an adaptation of metabolism, such that internal energy stores available at the onset of fasting are used to maintain basal metabolism and physical activity. The biochemical and physiological adaptations that result from a lack of food help to preserve physiological function in order to maintain behaviors like food seeking or predator avoidance and also, to resume all metabolic processes necessary when food becomes available. However, absolute or long term food deprivation observed in nature or in restrictive AN proceeds in stages in which the individual/organism tries to adapt its metabolism to energy costs but that culminates in death, due to exhaustion of energy stores. As clearly described by Wang et al. (2006), the different stages progress from fasting to starvation, but “The demarcation between these two states is rarely appreciated, perhaps owing to lack of definition. In humans, fasting often refers to abstinence from food, whereas starvation is used for a state of extreme hunger resulting from a prolonged lack of essential nutrients. In other words, starving is a state in which an animal, having depleted energy stores, normally would feed to continue normal physiological processes.” Briefly, three metabolic phases are described during food deprivation (Wang et al., 2006) where energy metabolic adaptations occur to allow supply of fuel in the different parts of the organism, especially the brain (see Table 1). In regard to these metabolic stages, the transition from fasting to starvation occurs by the end of phase II or the beginning of phase III. Thus, voluntary anorexia as seen in restrictive AN should correspond to phases I and II.
Table 1. Different metabolic phases occurring during food restriction and permitting distinction between fasting and starvation (see Wang et al., 2006).
Restrictive AN is a feeding behavior disorder in which severe chronic food restriction causes dramatic physiological and psychological effects that are detrimental to health. AN is most prevalent in women aged 25 years or younger (whose BMI reaches values well below 18.5 kg/m²) and is currently the third largest cause of chronic illness in teenagers (Lucas et al., 1991). The prevalence of AN has drastically increased within recent decades. It leads to central and/or peripheral reprogramming that permits the individual/organism to endure a reduced energy supply. These drastic conditions not only induce severe weight loss and metabolic disturbance, but also infertility, osteopenia, and osteoporosis. Moreover, AN is increasingly recognized as an addictive behavior disorder. Indeed, many of its common primary characteristics (food obsession coupled with food restriction, weight loss, heightened physical activity, and a strong association with mood disorders such as anxiety or depression) strongly suggest a potential alteration of the central (dopaminergic) reward system.
Anorexia nervosa patients exhibit significant changes in the release of key hormones involved in energy balance and feeding control (Hasan and Hasan, 2011). For example, the plasma levels of ghrelin, an orexigenic hormone mostly released from the empty stomach, are increased in AN patients throughout the day (Germain et al., 2009, 2010). This hormone acts centrally to increase food intake (Wren et al., 2001a,b) and food-motivated behavior (Skibicka et al., 2012), but has also been suggested to be required for the maintenance of blood glucose homeostasis during severe calorie restriction (Zhao et al., 2010). The increase in plasma ghrelin levels in AN seems paradoxical in light of the restrained eating adopted by these patients and suggests an adaptive response to the disease. In regard to the metabolic deficiencies occurring in restrictive AN (see infra), the aim of this review is to highlight the impact of ghrelin in the adaptation of the organism to chronic food restriction until it falls into exhaustion and death. A better understanding of the role of this gastric hormone in dysfunctional AN-like feeding behavior is important when evaluating its therapeutic potential for the treatment of AN, envisaged to be used alongside mainstay psychiatric and nutritional therapies.
Where does obesity begin? What drives you to eat too much or expend too little energy, and why has there been such a dramatic increase in obesity since 1980? Some recently popular explanations are the carbohydrate-insulin hypothesis (CIH), singling out the prevalence of carbohydrates in the diet, and the food reward hypothesis (FRH), putting the primary blame on the availability of “hyper-palatable” food.
In this post I will present evidence for a new paradigm, which I call the Hypothalamic Hypothesis (HH). I think it provides a better explanation for the facts of obesity than the CIH and FRH theories, and leads to some different advice about how best to lose weight.
Some recent research suggests that obesity starts with specific physical changes to the brain. Appetite is regulated by the hypothalamus, particularly the arcuate nucleus (ARC), ventromedial hypothalamus (VMH) and lateral hypothalamus (LH). It turns out that two very specific changes to the brain cause us to get hungry, overeat, burn less fat, and gain weight. And these changes to particular brain structures come about as a result of what you eat, eating frequency, and to some extent your activity level. The problem of obesity or overweight is often portrayed as a single problem, but it is really two problems, and each type of obesity corresponds to one type of brain alteration. Failure to distinguish these two types of obesity has resulted in much confusion. In part, the confusion comes about because these two types of obesity frequently occur together in the same individual, although one type is usually dominant. If you understand this, and you understand the role your brain plays, you can become more successful at losing excess weight.
I’ll spend a little time explaining the theory, provide some specific suggestions for how it can help you fine tune your weight loss program, and try to point out why I think the Hypothalamic Hypothesis overcomes some weaknesses of the other obesity theories.
Two types of obesity. One major type of obesity is subcutaneous (SC) obesity. The man on the right is a Sumo wrestler with subcutaneous obesity, but you don’t have to be a wrestler to have this type of fat distribution. It is characterized by lots of looser, softer fat hanging from the torso, arms, legs and even the face. A double chin and skin folds under the arms are not uncommon for this type. SC obesity is more common among women than men.
The second major type of obesity is visceral or “intra-abdominal” (IA) obesity. This is depicted by the classic “beer belly” sported by the man in the left photograph, characterized by a protuberant gut, but frequently not a lot of extra fat on the legs or arms. It’s quite prevalent among men, but seen in many women as well.
The above photos show extreme types, but it is common for both types of obesity to coexist in the same person, in varying degrees. Those with predominant IA obesity are sometimes referred to as “apples”; those with predominant SC obesity are called “pears”.
Different metabolisms. The difference between subcutaneous and intra-abdominal obesity is not merely a matter of how adipose tissue is distributed on the body, but also of the biological composition of the fat tissue and its metabolic activity. Subcutaneous fat is located just beneath the skin, on the outside of the muscle tissue, all over the body. By contrast, intra-abdominal fat (also called visceral fat) is located underneath the visceral muscles, deep within the gut. It surrounds the digestive organs: the liver, pancreas, stomach and intestines. The difference can be seen clearly in the CT scans at the left. The top image shows a cross-section at mid-belly level of someone with SC obesity, with most of the dark gray fat mass located right under the skin but outside the lighter grey visceral muscles and internal organs. The bottom image is a similar CT scan of someone with IA obesity, showing much less subcutaneous fat, but considerable fat beneath the walls of the viscera, packed around the intestines.
What is important to realize is that the adipose tissue stored inside the abdomen is biochemically and metabolically very different from the fat stored right under the skin. Both are called “fat” or “adipose tissue” but they behave as if they were entirely different substances. The image below at left is a micrograph of SC fat; the image at right shows IA fat cells. Notice the different shape and size, but also the substantial dark “mortar” between the IA fat cell “bricks”.
The adipose tissue in IA fat is not an inert storage tissue. On the contrary, it is a metabolically active hormonal “organ”: it is infiltrated by macrophages and secretes “adipokines” like interleukin-6, tumor necrosis factor alpha, and C-reactive protein. These compounds are inflammatory signaling agents, associated with insulin resistance, diabetes, hypertension, and cardiovascular disease characteristic of Metabolic Syndrome. The health effects of this inflammatory process have been the subject of intense study. In this article, however, I’ll address only the role that these inflammatory processes have in the development of obesity.
The appetite center. To understand the dynamics of each type of obesity, it is important to understand how appetite and body fat are governed by the brain. The hypothalamus regulates biological drives, including feeding, sleep and hunger. As shown in the diagram at right (and also in this video) appetite, feeding behavior and metabolic rate are regulated by two sets of neurons that have opposite effects on appetite and metabolism:
- The “anorexigenic” POMC/CART neurons that inhibit appetite and increase the rate of fat oxidation in the body. In response to nutrients and certain hormones, these neurons produce the appetite-suppressing neuropeptides pro-opiomelanocortin, cocaine-and-amphetamine-regulated transcript, and α-melanocyte stimulating hormone (α-MSH). The α-MSH binds to and activates secondary melanocortin-4 (MC-4) neurons in the ventromedial hypothalamus (VMH), causing satiety and increasing energy expenditure and fat oxidation in the body. Animals with damaged or lesioned POMC/CART neurons eat voraciously and become obese. Both leptin and insulin are potent hormonal stimulators of the POMC/CART neurons. These neurons have receptors for appetite-suppressing signals like insulin and leptin; low levels of either hormone will increase appetite and reduce metabolic rate. If a deficiency of leptin or insulin persists, it will lead to obesity.
- The “orexigenic” NPY/AgRP neurons that stimulate appetite and slow down fat oxidation in the body. These neurons produce two neuropeptides, neuropeptide Y (NPY) and agouti-related protein (AgRP), which act to inhibit α-MSH from binding to and activating the MC-4 satiety neurons and to stimulate melanin-concentrating hormone (MCH) in the lateral hypothalamus (LH). This inhibition of MC-4 and stimulation of MCH enhances appetite and decreases metabolism and energy expenditure, conserving fat. Animals in which the NPY/AgRP neurons have been damaged or destroyed by lesions become anorexic and lose weight. Insulin and leptin inhibit the NPY/AgRP neurons, whereas the “meal timing” hormone ghrelin, which cyclically ebbs and flows, stimulates them.
These two sets of neurons govern fat gain and fat loss. They effectively sense the energy status of the body by centrally integrating inputs from a large number of circulating nutrients, neuropeptides and hormones, and they respond by outputting neuropeptides that drive behavior and peripheral metabolism. When they are in balance, a normal and healthful level of body fat is maintained, but when the balance of orexigenic and anorexigenic signals shifts, this adjusts the body’s fat and activity set points up or down. As a prime example, if leptin levels in the hypothalamus are low, either because of low body weight or because leptin is blocked from reaching its receptors in the POMC neurons, appetite will increase, fat oxidation will decrease, and adiposity will increase.
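The push-pull between these two neuron populations can be caricatured in a few lines of code. This is a purely illustrative toy model (the function name, weights and units are all my own inventions, not taken from the literature); it captures only the sign of the interactions described above: leptin and insulin excite POMC/CART and inhibit NPY/AgRP, while ghrelin excites NPY/AgRP.

```python
def appetite_drive(leptin, insulin, ghrelin):
    """Net appetite signal: positive = hunger, negative = satiety.

    Toy model only; the weights below are invented for illustration.
    """
    # Anorexigenic POMC/CART activity rises with leptin and insulin.
    pomc_cart = 0.6 * leptin + 0.4 * insulin
    # Orexigenic NPY/AgRP activity is stimulated by ghrelin and
    # inhibited by leptin and insulin.
    npy_agrp = ghrelin - 0.5 * leptin - 0.3 * insulin
    # Behavior follows the difference between the two populations.
    return npy_agrp - pomc_cart

# Fed state: high leptin/insulin, low ghrelin -> net satiety (< 0).
fed = appetite_drive(leptin=1.0, insulin=1.0, ghrelin=0.2)
# Fasted state: low leptin/insulin, high ghrelin -> net hunger (> 0).
fasted = appetite_drive(leptin=0.2, insulin=0.1, ghrelin=1.0)
```

Zeroing out one of the two terms mimics the lesion results above: with no POMC/CART term the hunger signal is unopposed (voracious eating), and with no NPY/AgRP term satiety dominates (anorexia).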
Insulin, leptin and appetite. There are two hormones which predominantly regulate body fat: insulin and leptin. In healthy individuals, as Byron Richards describes,
Leptin uses adrenaline as a communication signal to fat cells, telling them to release stored fat to be used for fuel. This takes place in the course of a normal day between meals and at night during sleep…A drop in leptin signals hunger. Food intake stimulates insulin release. As a person eats, insulin is always directing some amount of triglycerides to go over to white adipose tissue and enter fat cells….This turns on the production of leptin in fat cells, causing the blood level to rise in response to the meal. As the leptin levels rise high enough, they signal to the brain that enough has been eaten. Leptin now signals the pancreas to stop making insulin…In overweight people, the communications involving insulin and leptin are inefficient. It is like making a phone call where no one answers. Insulin and leptin resistance mean that the hormones don’t communicate efficiently in response to food.” (The Leptin Diet, p. 13, 17, 23, 36)
Increased basal levels of either of these two hormones indicate increased energy stores and adiposity. The hormones have different metabolic effects depending on their site of action. As Lustig explains, the action of these hormones “centrally” (inside the brain) is entirely different from that in the “periphery” (the rest of the body):
Insulin also plays a pivotal role in the control of appetite and feeding. In addition to its well-defined peripheral role in glucose clearance and utilization, insulin is involved in the afferent (and efferent) hypothalamic pathways governing energy intake, and in the limbic system’s control of pleasurable responses to food. Whereas insulin drives the accumulation of energy stores in liver, fat, and muscle, its role in the CNS tends to decrease energy intake. This is not a paradox, but rather an elegant instance of negative feedback. When energy stores abound, circulating insulin tends to be high; high CNS insulin tends to decrease feeding behaviors, thereby curtailing further accumulation of energy stores. Insulin’s central effects on energy intake are manifested in two complementary ways: first, insulin decreases the drive to eat; second, insulin decreases the pleasurable and motivating aspects of food.
This self-limiting regulatory action of insulin is also noted by Banks:
Insulin plays many roles within the CNS. Several laboratories have shown that some of the CNS effects of insulin are the opposite of those effects mediated through peripheral tissues. In particular, CNS insulin increases glucose and inhibits feeding, whereas serum insulin decreases glucose and increases feeding. Thus, to some extent, insulin acts as its own counterregulatory hormone, with CNS insulin producing features of insulin resistance.
Both insulin and leptin have an appetite-suppressing effect when an elevated level of either one reaches the appetite center of the brain, specifically the satiety-inducing POMC/CART neurons within the arcuate nucleus (ARC) of the hypothalamus. While similar in their appetite-suppressing effect, insulin levels fluctuate in response to the ingestion of meals, especially carbohydrate-rich meals, whereas leptin levels generally reflect longer-term changes in energy stores. Most noteworthy for this discussion, however, is that these two hormones reflect the two different types of fat. According to Woods et al.:
Insulin is secreted in proportion to visceral fat, whereas leptin reflects total fat mass and especially subcutaneous fat. This is an important distinction with regard to the message conveyed to the brain, since visceral fat carries a greater risk factor for the metabolic complications associated with obesity than does subcutaneous fat. Elevated visceral fat carries an increased risk for insulin resistance, type 2 diabetes, hypertension, cardiovascular disease, and certain cancers. Hence, leptin and insulin each convey specific information to the brain regarding the distribution of fat, and the combination of the two additionally conveys information as to the total fat mass of the body.
Interestingly, Woods also reports that the brains of females are more sensitive to leptin than insulin, whereas the reverse is true in males, and that estrogen mediates this difference. According to Cnop et al., women on average have three times as much leptin as men, even after controlling for comparable degrees of body mass and insulin resistance. This explains why there are more male “apples” and more female “pears”, though of course both types of obesity are represented to varying degrees in both genders.
While the appetite-regulating actions of insulin and leptin within the brain are well known, what is less well known is that these two hormones also use “remote control” from within the brain to activate fat loss in the rest of the body. According to Woods:
As previously mentioned, when leptin is administered into the brains of experimental animals, there is a selective reduction of body fat, with lean body mass being spared. Likewise, when insulin is administered into the brain, there is a reduction of the respiratory quotient, suggesting that the body is oxidizing relatively more fat. These observations suggest that one action of these adipose signals within the brain is to reduce body fat, and a corollary of this is that fat ingestion would be expected to be reduced as well. Consistent with this, we have observed that when insulin is administered into the third cerebral ventricle of rats, fat intake is selectively reduced. Hence, it is reasonable to hypothesize that leptin and insulin, acting in the brain, reduce body fat by increasing lipid mobilization and oxidation and simultaneously by reducing the consumption of dietary fat.
In short, if you want to control your appetite and burn fat faster, you want leptin and insulin to get inside your brain! The problem in obesity is that these hormones are not adequately reaching and communicating with the appetite center of the hypothalamus.
Putting up resistance. So far, I’ve described how leptin and insulin work to homeostatically regulate appetite and body fat in normal individuals. But this carefully balanced feedback system becomes derailed in obesity. There are some interesting, but fortunately rare, genetic or disease conditions in which the leptin- or insulin-sensitive receptors in the hypothalamus become defective and insensitive to leptin or insulin; in other words, the “off” switch for appetite stops working correctly. In other cases the leptin or insulin molecules themselves are mutated or damaged and are thus unable to turn off the appetite switch. Animals or humans with these defects eat voraciously and insatiably and become extremely obese. These rare cases provided some of the initial evidence for the current understanding of how leptin and insulin regulate appetite and body weight.
But defective hormones and receptors are rare and do not explain the vast majority of cases of obesity. The “normal” cause of obesity involves leptin resistance or hypothalamic insulin resistance, whereby there is plenty of leptin or insulin circulating in the bloodstream, and the appetite-suppressing POMC neurons are functional, but not all of the hormone is reaching the receptors in the hypothalamus. The messenger is yelling, but the ears hear the message faintly; there is a barrier or impediment between messenger and receiver. The result in each case is that appetite is not being satisfied, so there is a drive to overeat. Furthermore, as Woods notes, the “remote control” fat-burning functions of the hypothalamus are also reduced. As a result, with more eating and less fat mobilization and oxidation, you get fatter.
Now, let’s see in more detail what happens to the hypothalamus in each main type of obesity.
Subcutaneous (SC) obesity and the brain. Leptin is produced in adipose tissue, but specifically in SC fat. The more SC fat, the more elevated the leptin concentration in the blood. Normally this would provide a negative feedback signal, inducing satiety in the hypothalamus and increasing the release of fatty acids from fat cells. In SC obesity, however, only a low level of this leptin is reaching the hypothalamus, so appetite and eating are not inhibited. But why does this happen? What is the mechanism?
Some, like Lustig, see insulin resistance in the brain as a likely driver of leptin resistance:
Hyperinsulinemia itself may be a cause of leptin resistance. As described, insulin and leptin use many of the same neurons, the same second messengers, and the same distal efferents to effect induction of satiety….Although confirmation in animal studies is needed…CNS insulin resistance may be a proximate cause of leptin resistance, promoting continued weight gain.
However, it is not plausible to blame leptin resistance on insulin resistance, because many of the obese are insulin sensitive. For example, Sumo wrestlers notably can weigh 500 pounds or more, but they are typically insulin sensitive and have low cholesterol. According to a study by Gerald Reaven of Stanford:
The ability of insulin to mediate glucose disposal varies more than six-fold in an apparently healthy population, and approximately one third of the most insulin-resistant of these individuals are at increased risk to develop cardiovascular disease. Differences in degree of adiposity account for approximately 25% of this variability, and another 25% varies as a function of level of physical fitness. The more overweight/obese the person, the more likely they are to be insulin-resistant and at increased risk of cardiovascular disease, but substantial numbers of overweight/obese individuals remain insulin-sensitive, and not all insulin-resistant persons are obese.
Recent evidence suggests that the crux of leptin resistance can be located at the door to the brain: the blood-brain barrier (BBB). The BBB is semipermeable along the arcuate nucleus. This allows for controlled, selective transport of various nutrients and energy signals. According to Banks,
The blood–brain barrier (BBB) prevents the unrestricted movement of peptides and proteins between the brain and blood. However, some peptides and regulatory proteins can cross the BBB by saturable and non-saturable mechanisms. Leptin and insulin each cross the BBB by their own transporters. Impaired transport of leptin occurs in obesity and accounts for peripheral resistance; that is, the condition wherein an obese animal loses weight when given leptin directly into the brain but not when given leptin peripherally. Leptin transport is also inhibited in starvation and by hypertriglyceridemia. Since hypertriglyceridemia occurs in both starvation and obesity, we have postulated that the peripheral resistance induced by hypertriglyceridemia may have evolved as an adaptive mechanism in response to starvation.
In a study on mice, Banks et al. showed that triglycerides, but not free fatty acids, induce leptin resistance. The same study showed that fasting for 16 hours reduced triglycerides and increased leptin transport, whereas fasting for 48 hours increased triglycerides and impaired leptin transport. This provides support for intermittent fasting as a strategy to reverse leptin resistance. Elevated triglycerides also enhance the transport of ghrelin, the hormone responsible for initiating feeding at conditioned meal times, which explains why certain obese people get especially hungry around meal time.
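The idea of a saturable transporter inhibited by triglycerides can be sketched as a Michaelis-Menten-style toy model. This is my own illustration, not Banks's data: the functional form and every constant are assumptions, chosen only so that triglycerides raise the transporter's effective half-saturation constant.

```python
def leptin_into_brain(serum_leptin, triglycerides,
                      vmax=1.0, km=1.0, tg_gain=2.0):
    """Leptin flux across the BBB via a saturable transporter.

    Toy model: triglycerides raise the effective Km, so the same serum
    leptin level delivers less leptin to the hypothalamus. All parameter
    values are invented for illustration.
    """
    effective_km = km * (1.0 + tg_gain * triglycerides)
    return vmax * serum_leptin / (effective_km + serum_leptin)

# Same serum leptin, different triglyceride levels:
low_tg  = leptin_into_brain(serum_leptin=5.0, triglycerides=0.1)
high_tg = leptin_into_brain(serum_leptin=5.0, triglycerides=2.0)
# low_tg > high_tg: less leptin reaches the brain when triglycerides are
# high, however leptin-rich the periphery is -- the "peripheral resistance"
# pattern Banks describes.
```

The saturable form also means that once the transporter is maxed out, piling on more serum leptin (as in obesity) yields almost no extra leptin in the brain, which is consistent with why peripherally administered leptin fails in obese animals.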
Triglyceride levels tend to increase with your degree of adiposity. But what causes them to rise in the first place? The primary culprit seems to be fructose, which is converted to triglycerides if consumed in excess. Of course, fructose is part of sucrose and high fructose corn syrup, so any of these sugars in excess will elevate triglycerides, cause leptin resistance, and promote SC obesity. Foods containing high concentrations of sugar include sodas, candies, breakfast cereal, bread and other baked goods, but also sugary fruits like bananas, mangos and raisins. Michael Eades recognized the connection between triglycerides, the blood-brain barrier and appetite in his 2007 blog post “Leptin, low-carb and hunger“. But I suspect that it is specifically the effect of fructose reduction, and not the generalized carbohydrate reduction postulated by Eades, that is the primary explanation for why low-carb diets reduce appetite so well for many people.
Diet, of course, is not the only factor affecting how the blood-brain barrier mediates leptin resistance. For example, Banks also notes that epinephrine enhances leptin transport across the BBB by two- to three-fold. This explains why exercise and excitement can act to suppress appetite.
Intra-abdominal (IA) obesity and the brain. Insulin is produced by the pancreas. When it circulates through most of the body outside the brain and spinal cord (what physiologists call the “periphery”), its main function is to regulate the availability and storage of glucose and fatty acids, thus preventing excessive glucose or fatty acid levels in the bloodstream. When insulin receptors in liver, muscle, and other tissues become less responsive to insulin, the resulting insulin resistance leads to hyperinsulinemia and its associated metabolic derangements such as Type 2 diabetes. There has been much investigation into what causes insulin resistance, the lead hypothesis being some sort of inflammation with many suspects, including certain fats.
Unlike leptin transport, insulin transport into the brain is not impaired by triglycerides, according to a study by Urayama and Banks.
So what, if not triglycerides, leads to insulin resistance in the brain?
The answer appears to be: free fatty acids. Certain fatty acids (trans fats, certain long-chain saturated fatty acids, and omega-6 unsaturated fatty acids) produce an inflammatory response in insulin receptors that blunts insulin sensitivity. By contrast, other fatty acids, principally omega-3 fatty acids (like flax or fish oil) and short- or medium-chain triglycerides (like coconut oil), are actually anti-inflammatory. Certain sugars like fructose also appear to be pro-inflammatory. But what has not been recognized until recently is that these inflammatory processes occur not just in the liver and muscles, but also within the hypothalamus.
And in fact, inflammation of the hypothalamus may be where insulin resistance starts.
Posey et al. found that mice fed a high-fat diet, with equal calories to a low-fat diet, gained 60% more adipose tissue than those on the low-fat diet. Other experiments by Kaiyala et al. showed that a high-fat diet resulted in a 60% reduction in CNS insulin levels, inversely associated with changes in body weight. Thaler et al., Schwartz et al. and Benoit et al. showed that one particular long-chain saturated fatty acid, palmitic acid, causes inflammation and reduces insulin sensitivity in the hypothalamus, leading to overeating and obesity. Arruda et al. found that intracerebroventricular injection of an inflammatory cytokine (TNF-α) or stearic acid (another long-chain saturated fatty acid) into lean rats induced insulin and leptin resistance in the hypothalamus, caused hyperinsulinemia, and down-regulated thermogenesis and oxygen utilization. In TNF knockout rats (those missing the TNF receptor), TNF-α did not produce any of these effects, and the rats were protected. Furthermore, Araujo et al. showed that co-administering an anti-inflammatory drug (infliximab) restored normal oxygen consumption in the obese rats. Similar results from other studies have been reviewed by Schwartz et al.
Interestingly, high levels of fructose can also cause inflammation and insulin resistance, leading to IA obesity. If you are lean and healthy, fructose at reasonable levels is converted to glucose in the liver, and brief excess is then stored as glycogen in the liver and muscles. But in vast excess, fructose is converted to fat of two types: triglycerides and one particular fatty acid. Can you guess which fatty acid? The answer is palmitic acid, the fatty acid associated with brain insulin resistance. The liver begins to accumulate the excess fat, a condition known as steatosis or fatty liver disease, which results in hepatic insulin resistance. So while high fructose consumption causes elevated triglycerides, those triglycerides cause leptin resistance but are not a direct cause of insulin resistance. It thus looks like fructose (and of course sucrose, which is 50% fructose) is involved in the genesis of both SC obesity and IA obesity. This fact is just one manifestation of how easy it is to get confused about “the cause” of obesity. Because there are two types of obesity with different but intertwined etiologies, the logic of obesity is not always easy to sort out. But the various diverse causal threads always come together in the arcuate nucleus of the hypothalamus.
What is most illuminating, however, is research by Ono et al. showing that hypothalamic insulin resistance precedes — and probably causes — insulin resistance in other organs and tissues. Ono found that feeding rats a high fat diet induced insulin resistance in the hypothalamus after only one day, with no concurrent hepatic insulin resistance! It took a full 3 days on this diet for insulin resistance to show up in the liver, and 7 days for the muscles and peripheral tissues to become insulin resistant. The mechanism of inflammation was the activation of the mTOR/S6K pathway by exposure to fatty acids. The S6K protein apparently inhibits insulin signaling in the arcuate nucleus of the hypothalamus, activating the orexigenic NPY/AgRP neurons and inhibiting the POMC neurons. Similarly, Pagotta has marshalled other evidence suggesting that insulin resistance starts in the brain. Of particular note is a study by Obici et al., in which central administration of insulin suppressed glucose production by the liver, and blocking insulin signaling in the brain impaired the ability of insulin to inhibit glucose production in the liver. Finally, an excellent post by Stephan Guyenet cites a similar study by Morton and Schwartz showing much the same thing. As Guyenet notes,
Investigators showed that by inhibiting insulin signaling in the brains of mice, they could diminish insulin’s ability to suppress liver glucose production by 20%, and its ability to promote glucose uptake by muscle tissue by 59%. In other words, the majority of insulin’s ability to cause muscle to take up glucose is mediated by its effect on the brain.
In regard to insulin signaling, the brain seems to be in charge of the liver. And this plays out in the genesis of insulin resistance.
This raises an interesting question: why would insulin resistance start in the brain, rather than the liver or the muscles? When you think about it for a few minutes, it actually makes sense. The hypothalamus is the ultimate arbiter of whether or not the body has adequate energy intake. It does this by homeostatically regulating energy stores and energy-balancing hormones. In the case of leptin resistance, as we’ve already seen, the brain acts to restore homeostasis by signaling the peripheral metabolism to “grow” more subcutaneous fat (by increasing appetite and slowing fat oxidation). If insulin signaling in the brain is blocked or impaired, homeostasis requires the initiation of compensatory processes that will bring more insulin into the brain. But how to do that? Insulin is not produced in the fat cells, so growing more fat won’t directly help. Instead, the periphery must somehow become hyperinsulinemic, overcompensating so that enough insulin gets into the hypothalamus. And the best mechanism for this is to induce whole body insulin resistance, primarily in the liver and muscles.
But how does the insulin resistant brain orchestrate insulin resistance in the periphery? The answer, apparently, is to grow intra-abdominal fat. As Ljung notes, hypothalamic insulin resistance disrupts the hypothalamic-pituitary-adrenal (HPA) axis, leading to increased secretion of ACTH and cortisol. These hormones in turn stimulate the growth of intra-abdominal adipocytes. The IA fat accumulates macrophages and releases pro-inflammatory fatty acids and “adipokines” into the bloodstream. (See “Intra-Abdominal Adipose Tissue: The Culprit?“) The portal circulation carries these to the liver, where they promote steatosis (fatty liver), insulin resistance, and local inflammation. The systemic circulation further carries these fatty acids and proinflammatory molecules to skeletal muscle, where they promote lipid accumulation, insulin resistance, and local inflammation. As Ross showed, it is IA fat, not total fat or SC fat, that is associated with whole body insulin resistance. Insulin resistance in the body causes the pancreas to go into overdrive to supply more insulin, resulting in hyperinsulinemia. As basal insulin levels increase, the hypothalamus is now getting its fix of insulin, keeping hunger in check. Of course, the level of IA obesity and hyperinsulinemia will only be what is required to handle the degree of inflammation experienced by the arcuate nucleus in the brain. Once this inflammation is reduced or removed, and the NPY/AgRP neurons become more sensitive to insulin, the requirement for elevated basal insulin should go down, and with it the need for intra-abdominal fat.
In slogan form, here is the Hypothalamic Hypothesis of Obesity:
Now for some practical advice: How can you use the Hypothalamic Hypothesis to lose unwanted fat or better control your weight?
1. Start by assessing your degree and type of adiposity. Do you have a waist-to-hip ratio greater than 0.8 (women) or 1.0 (men) and carry your extra weight in a belly that sticks out in front? That’s IA fat and you are probably an “apple”. Or do you have a waist-to-hip ratio of less than 0.8 (for women) or 1.0 (for men) and carry most of your extra weight on your butt, your thighs, chest, and possibly also your arms and neck? That’s SC fat and you are probably a “pear”. Of course, you may be an “apple-pear” and carry extra fat in both locations, but it is good to know which type of fat is dominant. If you want a much more precise assessment using specific measurements of body weight, height and other body dimensions, I recommend consulting “Assessing Your Risk”, Chapter 9 in Protein Power, by Eades and Eades.
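The rule of thumb above amounts to a two-line calculation. Here is a minimal sketch; the function name and labels are my own, and only the 0.8/1.0 thresholds come from the text:

```python
# Classify dominant fat pattern from waist-to-hip ratio (WHR).
# Thresholds (0.8 for women, 1.0 for men) follow the rule of thumb above;
# this is an illustrative sketch, not a clinical tool.

def body_fat_pattern(waist_cm: float, hip_cm: float, sex: str) -> str:
    whr = waist_cm / hip_cm
    threshold = 0.8 if sex == "female" else 1.0
    # Above the threshold: intra-abdominal (IA) fat dominates ("apple");
    # at or below it: subcutaneous (SC) fat dominates ("pear").
    return "apple (IA fat)" if whr > threshold else "pear (SC fat)"

print(body_fat_pattern(95, 100, "male"))    # WHR 0.95, at or below 1.0
print(body_fat_pattern(88, 100, "female"))  # WHR 0.88, above 0.8
```

Note that the same WHR of 0.95 would classify a woman as an “apple” but a man as a “pear”, which is why the sex-specific threshold matters.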
2. If you are primarily a “pear”, and particularly if you are significantly overweight, you are leptin-resistant. Your primary focus should be on reducing triglycerides. Largely, this means cutting back on carbohydrates containing fructose or sucrose (a disaccharide of fructose attached to glucose), which are readily converted to triglycerides by the liver. And it is triglycerides that primarily induce leptin-resistant SC obesity. So of course you want to cut out soft drinks, cookies, cakes, ice cream, candies, most fruits, and most breads (except those with no sugar, which are hard to find). But so long as you are reasonably insulin sensitive, you don’t have to cut out starches. Potatoes and rice are probably fine if you are insulin-sensitive, as long as you avoid any sugar in the same meal. If you are an “apple-pear” and are resistant to both leptin and insulin, then you can still eat fructose-free starches like potatoes and rice, but you must not add any pro-inflammatory fats. The question of what constitutes a “pro-inflammatory fat” is a controversial one. Some fats, such as trans fats and high levels of omega-6 fats, are clearly pro-inflammatory, while omega-3 fats, mono-unsaturates like olive oil, and medium chain triglycerides like coconut oil are anti-inflammatory. But for saturated fats, the picture is less clear and the studies are all over the place. Probably some saturated fats are OK. But some people have found that cutting back on cheese and nuts helps them shed abdominal fat. Milk and butter from grass fed cows may be preferable to that from grain fed cows.
What about alcohol? Alcohol is frequently assumed to raise triglyceride levels, but observational studies show this is not necessarily true. Moderate alcohol consumption may actually reduce triglyceride levels.
Finally, as Banks’ fasting study suggests, intermittent fasting (16 hours, but not 48 hours) can reduce triglycerides and restore leptin sensitivity.
3. If you are primarily an “apple”, pre-diabetic, or trying to lose stubborn belly fat (the last 10-20 pounds), your primary focus should be on eating a non-inflammatory diet. For the most part, this means cutting back on certain fats: trans fats (anything “partially hydrogenated” on the nutrition label), vegetable fats high in omega-6 oils, and perhaps certain saturated fats like those in meat, milk, butter or cheese from grain-fed cows. As mentioned above, the question of which saturated fats are “pro-inflammatory” is controversial. The strongest evidence connecting saturated fatty acids to brain insulin resistance is for palmitic acid, but that does not mean all saturated fatty acids cause insulin resistance. In any case, don’t shun non-inflammatory fats like fish oil, olive oil, or coconut oil. Adding these to your meals can help reverse IA obesity. I’ve personally found coconut oil to be great for energy and weight loss.
Because consuming high levels of sugar in the diet (fructose, sucrose or syrups that contain them) causes output of pro-inflammatory palmitic acid, foods containing sugar should be restricted. If you are lean and have a healthy liver, I see nothing wrong with fructose in moderate quantities. The daily apple will not hurt you, but the excessive amounts of sugar in sodas, pastries, ice cream, bread (which contains sugar), and sweet fruit make you (or maintain you as) both a “pear” and an “apple”.
In addition to avoiding high levels of certain fatty acids and sugars, inflammation can also be reversed by a few additional steps:
- ensuring adequate intake of anti-inflammatory micronutrients such as Vitamin D and magnesium
- high intensity exercise, intermittent fasting, cold showers and other hormetic stressors which upregulate anti-inflammatory brain compounds such as BDNF
Caveats. In making the above suggestions, I would like to make a disclaimer: This post is primarily about a new paradigm of obesity, but I realize that people are looking for specific dietary recommendations. The above dietary advice is based upon my best attempt to interpret two general principles regarding the effects of triglycerides and inflammation on the appetite center of the hypothalamus. In doing this, I am relying on a large body of empirical evidence that is sometimes ambiguous or contradictory — for example, regarding which saturated fats are pro-inflammatory, and which are protective. And so I may be wrong about the hypothalamic effect of this or that specific food. Despite this uncertainty, the HH provides a test for deciding whether a food or practice is obesogenic and leads to overeating: namely, whether it raises triglycerides or inflames the hypothalamus. And it is also apparent that these guidelines for foods to avoid cut across conventional macronutrient categories like “fat” and “carbohydrate”, since the hypothalamus does not sort things out that way.
OTHER THEORIES OF OBESITY. I would like to close by contrasting the Hypothalamic Hypothesis with two other theories of obesity, showing how it better accounts for certain facts and leads to somewhat different recommendations for losing excess body fat.
The Carbohydrate / Insulin Hypothesis (CIH). Most prominently advocated by Gary Taubes, CIH holds that dietary fat plays no role in obesity. Rather, dietary carbohydrates, through their stimulation of insulin secretion, result in a greater degree of fat storage. Carbohydrates drive insulin, which drives net fat storage. Obesity is a disorder of excess fat accumulation, not overeating or inadequate energy expenditure. In its favor, CIH can account for the close correlation between obesity and hyperinsulinemia, and the success of low carb dieting. However, it manifestly does not explain why many obese people, like Sumo wrestlers, are insulin sensitive, with normal insulin levels and no indications of diabetes, cardiovascular disease, or other signs of Metabolic Syndrome. It also does not account for why others, such as the Kitavans and Okinawans, can eat a diet low in fat but high in certain starchy carbohydrates (polymers of glucose) like root vegetables or rice, and remain lean, with low basal insulin levels. And it cannot explain why, despite sincere attempts, many people can lose only a certain amount of weight (probably subcutaneous fat) on low carb diets, but often stall and remain insulin resistant when continuing to eat a high fat / low carb diet. The HH can explain all these facts by carefully distinguishing SC obesity from IA obesity, and by narrowing the cause of each type of obesity to very specific types of carbohydrate (fructose and sucrose) and fat (long chain saturates, trans fats and omega-6 fats). And, perhaps heretically, HH predicts that once you’ve maxed out the benefits of low carb, you can get rid of that paunch and insulin resistance by cutting back on fats, at least the pro-inflammatory fats.
The CIH also cannot explain certain anomalies such as that described by Stephan Guyenet and Chris Masterjohn: the LIRKO mouse, which has severe hepatic insulin resistance and hyperinsulinemia — but remains leaner than its normal counterparts. Guyenet and Masterjohn seem to conclude from this that insulin resistance cannot be a cause of obesity. The mistake they make, I believe, is overlooking the possibility that only one type of insulin resistance — that of the hypothalamus — leads to obesity. The LIRKO mouse they discuss had an insulin resistant liver, but apparently a well functioning hypothalamus. It would have been interesting to feed it some pro-inflammatory fats to see what would happen.
One further aside about the CIH: I must admit that I was previously persuaded by the orthodox version of CIH and its explanation of hunger, which I now suspect is incorrect. I employed this theory elsewhere in this blog to explain the appetite-suppressing effect of low carb diets, intermittent fasting, and flavor control diets such as the Shangri-La Diet. The explanation was based on what I thought was a very plausible theory I first encountered in Gary Taubes’ Good Calories, Bad Calories, Chapter 24, “Hunger and Satiety.” The insulin-lowering effect of low carb diets is supposed to counteract hunger from hypoglycemia by making glucose and free fatty acids more available. And the appetite-inducing effect of appetitive flavors or aromas is explained by their action (probably via the vagus nerve, mediated by the brain’s tractus solitarius) in eliciting a preprandial insulin response. This preprandial insulin response supposedly causes a sudden drop in blood glucose, inducing hunger. I now believe this theory is wrong, or at least incomplete, for several reasons. Primary among these is my own experience with blood glucose self-monitoring, where I noticed that my blood glucose would typically drop after, but not before, I would get hungry. Also, preprandial insulin responses are typically fairly small and unlikely to reduce blood sugar enough to induce hypoglycemic hunger. So the preprandial insulin response seems too little, too late. It is more likely an effect, not a cause, of hunger. I now suspect that a more likely explanation would be the direct action of the vagus nerve and tractus solitarius on the orexigenic or anorexigenic neurons in the ARC, or on the permeability of the blood brain barrier. But that will be a topic for another post.
The Food Reward Hypothesis (FRH). The most effective advocate for the FRH is Stephan Guyenet, of Whole Health Source. Guyenet is the first to admit he is not the originator of this theory, which is common among obesity researchers and was prominently featured in David Kessler’s book, The End of Overeating. And Stephan also takes a modest stance in stipulating that he takes “food reward” to be a major explanatory factor, but not the sole causal factor, for obesity. For example, he mentions exercise, leptin resistance, energy excess and, yes, even hypothalamic inflammation, as “other” contributory causes of obesity. So FRH is not supposed to be a monocausal theory of obesity. But modesty aside, Guyenet has put a stake in the ground and marshaled considerable argument and evidence in support of FRH. Briefly, FRH holds that feeding people (or animals) foods that have a high “reward” level results in overeating and obesity. Here is how Guyenet defines “food reward”:
I use the term food reward to refer specifically to the motivational value of food, i.e. its ability to reinforce behavior. For example, acquiring a taste that causes a person to seek out the food in question more often. This is how some, but not all, researchers define the term. Others use the term “food reward” to refer to both the motivational and the palatability value of food. Palatability refers specifically to the enjoyment derived from a food, also called its hedonic value. Palatability and reward typically travel together, but not always. (“The Case for Food Reward,” Oct 1, 2011)
The theory is supported by experimental evidence, for example by the rapid weight gain seen with rats switched from ordinary chow to a high fat, high sugar “cafeteria diet”, and further developed by referring to the effects of such diets on brain opioids, dopamine circuits and other neurochemistry. Guyenet goes on to propose a remedy for the abundance of super palatable food: just say no. By avoiding overly rewarding food, our brains can return to sane eating and obesity can be avoided or reversed.
I feel a certain affinity for the FRH theory because, like HH, it is a “brain-centric” theory of obesity. Guyenet’s self-described field of research is “neurobiology of body fat regulation and obesity”, which I agree is the most promising way to study obesity. I’ve been excited to follow his cogent summaries of the most interesting research in this field. However, the FRH seems to have incorrectly formulated the connection between the brain and obesity. In fact, I’ve already discussed the FRH theory in another post, “Does tasty food make us fat?”. Here is what I wrote there:
But I think the theory is wrong, for the simple reason that it too blindly takes correlation for causation. And in doing so, it gets the causal direction mostly wrong. We don’t get fat because food has become too tasty. Rather, to a large extent, it is the metabolism and dietary habits of the obese that make food taste too good to resist, leading to insatiable appetites. And the good news is that we are not consigned to blandness. If we eat and exercise sensibly, we can eat flavorful, delicious foods and enjoy life, without packing on the pounds.
I had not formulated the HH theory when I wrote that post, but it fits the bill of what I said there: it is the metabolic effects of the pertinent foods in “cafeteria” diets that make them “rewarding” and engender the secondary effects on pleasure-related neurotransmitters like beta endorphin, dopamine or serotonin. What HH does is to more specifically locate the primary metabolic effects within the arcuate nucleus of the hypothalamus, rather than elsewhere in the body.
I think that HH can explain a number of things that FRH cannot. FRH is somewhat vague in that it does not go very far to identify what specific attributes of food make it rewarding and what specific mechanisms are involved. Somehow, sugar, fat and salt are involved. It is more like a schema than a full theory, which makes it hard to test or criticize. By contrast, HH is very specific about the mechanisms by which specific food chemistries interact with specific parts of the brain. HH, unlike FRH, provides an explanation for why certain “rewarding” foods will eventually lead to either subcutaneous or intra-abdominal obesity. HH holds that if you are neither leptin resistant nor insulin resistant, then no foods will be inherently hyper-rewarding, at least initially. Foods only become hyper-rewarding once insulin or leptin resistance begins to manifest itself. HH makes the further prediction that very tasty, palatable foods that contain no fructose or sucrose (or other agents that elevate triglycerides) and no pro-inflammatory fats will not lead to obesity, no matter how good they taste.
A wider perspective: The homeostatic pleasure principle. Finally, I think that the Hypothalamic Hypothesis provides a way to connect the hormonal regulation of obesity to something overlooked by both CIH and FRH: the role of emotion and cognition in obesity, and the relation of obesity to our wider sense of well being. Obesity is often a response to emotional factors like stress and depression, and conversely might be reversed by cognitive techniques such as cognitive reframing and meditation. By locating the origin of obesity within the hypothalamus, it becomes plausible to understand how stress hormones like cortisol or calming neurotransmitters like serotonin can have a powerful and direct effect on the behavior of hypothalamic neurons and their sensitivity to leptin and insulin, since these neurochemicals are lurking nearby within the “neighborhood” of the brain. Looked at more broadly, the hypothalamus can be thought of as a homeostatic regulation system that attempts to maintain an internal subjective sense of well-being or pleasure with respect to a broad range of drives, including not just eating, but sleep, sex, aggression, fear and other emotions. This homeostatic “pleasure principle” is fundamental — it provides a way to translate objective needs of the organism into conscious desires and emotions. This fits into a related line of thinking about brain receptor sensitivity that I wrote about in my post “Change your receptors, change your set point”. Whenever there is a dysregulation of the pleasure principle, such as occurs in the appetite drive of obesity, but also in conditions such as depression or addiction, we should look within the control system itself to find out what is going wrong. And that is what the HH does, by looking for specific brain mechanisms that explain not only our subjective experience, but the way the rest of the body responds objectively in homeostatic response to physiological disturbances.
Like this article or disagree with it? Add your comments below, or join the more extended discussion in the Discussion Forum.
Landmarks in the study of circadian rhythms
Since the findings by de Mairan and Kleitman, numerous converging lines of evidence support the endogenous nature of circadian timing. First, in constant conditions, the period of circadian rhythms is approximately, but not precisely, 24 h. Were rhythms driven by daily cues such as the LD cycle or geophysical signals, these cycles would be precisely 24 h. Second, unlike most biological processes, where increased temperatures hasten biochemical processes and decreased temperatures have the opposite effect, circadian rhythms are temperature compensated, such that the period of the rhythm is unaltered by temperature changes (Pittendrigh & Caldarola, 1973). This result rules out the possibility that daily changes in temperature are responsible for circadian rhythmicity, although some rhythms can be entrained to cycles in temperature. Together, these findings provide strong suggestive evidence for the endogenous regulation of circadian rhythms.
Suprachiasmatic nucleus as a brain master clock
The discovery of the suprachiasmatic nucleus (SCN) and its identification as a master brain clock launched the study of circadian rhythms into a fruitful era of mechanistic studies. In 1972, two laboratories showed that the destruction of a very discrete hypothalamic area, the SCN, led to the permanent loss of circadian rhythms (Moore & Eichler, 1972; Stephan & Zucker, 1972). The finding of a neural locus for circadian timekeeping provided definitive evidence for the endogenous control of circadian rhythmicity. In the two decades following the characterization of the SCN as a master circadian clock, substantial support accumulated for the notion that, in mammals, this hypothalamic nucleus is an internal timekeeper, with a necessary role in circadian timing. The supporting evidence includes proof that, both in vivo (Inouye & Kawamura, 1979) and in vitro (Gillette & Prosser, 1988), the SCN is rhythmic when isolated from the rest of the brain. When transplanted from a fetal donor animal into an SCN-lesioned host, an SCN graft rescues rhythmicity and the restored behavior has the period of the donor rather than the host (Lehman et al., 1987; Ralph et al., 1990). In addition, the molecular mechanisms responsible for rhythm generation at the cellular level have been well characterized, and it has been shown that genetic mutations or knockout of essential clock genes leads to either arrhythmicity or gross deficits in circadian timekeeping of the SCN (Hastings et al., 2014; Partch et al., 2014).
The molecular clockwork
The SCN is comprised of about 20 000 cells that form a highly organized network to produce a coherent, tissue-level clock (Welsh et al., 2010; Partch et al., 2014). At the cellular level, circadian rhythms are generated by interlocked transcriptional/translational feedback loops consisting of ‘clock’ genes and their protein products (reviewed in Zhang & Kay, 2010; Hastings et al., 2014; Partch et al., 2014) (Fig. 1). In mammals, the core feedback loop consists of two transcriptional activators, CLOCK and BMAL1, and two transcriptional repressors, the PERIOD (PER) and CRYPTOCHROME (CRY) proteins (Huang et al., 2012). In the morning, CLOCK and BMAL1 activate transcription of the Per (Per1, Per2 and Per3) and Cry (Cry1 and Cry2) genes by binding to the E-box (CACGTG) domain on their gene promoters. Over the course of the day, the Per and Cry genes are translated into their respective proteins. PER and CRY proteins form heterodimers late in the day that translocate from the cytoplasm to the cell nucleus to inhibit CLOCK:BMAL1-mediated transcription. The timing of nuclear entry is balanced by regulatory kinases that phosphorylate the PER and CRY proteins, leading to their degradation (Lowrey et al., 2000; Shanware et al., 2011). REV-ERBα/ROR-binding elements (Preitner et al., 2002) act to regulate Bmal1 transcription via a secondary feedback loop. The retinoid-related orphan receptor (ROR) is a transcriptional activator of Bmal1, whereas REV-ERBα, an orphan nuclear receptor, negatively regulates Bmal1. The same CLOCK:BMAL1 mechanism controlling Per and Cry gene transcription also controls transcription of Rev-erbα. This secondary feedback loop produces rhythmic expression of BMAL1, further stabilizing the clockwork. The clockwork at the cellular level is functionally similar across taxa, with interacting transcription/translation feedback loops driving rhythms at the cellular level.
Importantly, clock genes themselves are not conserved across higher taxa, but transcriptional feedback loops and post-transcriptional controls are common mechanisms for the generation of cell-based oscillation (reviewed in Harmer et al., 2001).
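The core architecture described above — activators drive transcription of their own repressors, whose delayed nuclear accumulation shuts transcription back off — can be caricatured by a Goodwin-type three-variable oscillator. The sketch below is purely illustrative: the parameters are my own choices, not a fitted model of the PER/CRY loop, and sustained oscillation in this simple form requires steep (high Hill coefficient) repression.

```python
# Toy Goodwin-type negative-feedback oscillator (illustrative sketch only):
# mRNA (x) -> cytoplasmic protein (y) -> nuclear repressor (z) -| transcription.
# Integrated with a simple forward-Euler scheme.

def simulate(steps=20000, dt=0.05, n=10, deg=0.1):
    x = y = z = 0.0   # mRNA, cytoplasmic protein, nuclear repressor
    trace = []
    for _ in range(steps):
        dx = 1.0 / (1.0 + z**n) - deg * x  # repressible transcription (Hill)
        dy = x - deg * y                   # translation
        dz = y - deg * z                   # nuclear accumulation of repressor
        x, y, z = x + dx * dt, y + dy * dt, z + dz * dt
        trace.append(z)
    return trace

trace = simulate()
late = trace[len(trace) // 2:]  # discard the initial transient
# A nonzero late-time amplitude indicates a sustained, self-generated rhythm.
print(round(max(late) - min(late), 2))
```

With the Hill coefficient lowered well below 8, the same system relaxes to a steady state instead of oscillating, which mirrors the general point that the delay and nonlinearity of the feedback, not the particular genes involved, generate the rhythm.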
Circadian rhythms and temporal niche
Circadian oscillation is key to understanding how organisms are synchronized to their local environments, and species-typical adaptations to their temporal niches are markedly influenced by environmental LD cycles (reviewed in Hut et al., 2012). As noted above, in mammals, photic input from the retina entrains the SCN, but somewhat surprisingly, the phases of SCN electrical, metabolic and molecular rhythms, relative to the light cycle, have the same daytime peaks in diurnally and nocturnally active species (reviewed in Smale et al., 2003). As an example, rhythms of Period gene expression in the SCN peak at approximately the same time of day in diurnal as in nocturnal rodents, suggesting that the phase of clock gene expression in the SCN relative to the LD cycle is conserved across mammalian groups, and implying that the signaling cascade initiating daily activity lies beyond the SCN. This phenomenon has piqued the interest of investigators, especially because there is significant evidence that switching of temporal niches can occur (Mrosovsky & Hattar, 2005; Gattermann et al., 2008). It appears that neural responses to light can mediate acute temporal-niche switching. Thus, a switch from nocturnal to diurnal activity rhythms occurs in wild-type mice transferred from standard intensity to scotopic levels of light in an LD cycle (Doyle et al., 2008). A similar switch from nocturnal to diurnal activity rhythms occurs in double-knockout mice, bearing little rod function, due to a lack of the inner-retinal photopigment melanopsin (OPN4) and of RPE65, a key protein used in retinal chromophore recycling. Interestingly, the rhythm of clock gene expression in the SCN is also reversed in these mice, suggesting that the nocturnal to diurnal switch is due to a change in the neural response to light upstream from the SCN. It is possible that a common mechanism produces phase reversals in Rpe65−/−Opn4−/− mice and in wild-type mice during dim LD cycles.
Ubiquity of circadian oscillators
The identification of ‘clock genes’ and the invention of reporter gene technology enabled the assessment of rhythmicity in cultured cells and tissues, such as SCN slice preparations. The technical developments and experimental findings based on assessing the activities of specific genes and proteins within cells and tissues have led to a reconceptualization of circadian organization as a hierarchy of oscillators. This vision has brought the circadian timing system to the attention of a very broad clinical and basic research community. Ignoring circadian effects leads to errors of interpretation in basic research and can result in suboptimal diagnosis and treatment in medicine. Circadian clocks regulate the timing of gene expression in each organ, and the regulated genes are unique to each organ (Akhtar et al., 2002; Duffield et al., 2002; Miller et al., 2007; Hughes et al., 2009; Dibner et al., 2010). Thus, circadian control overlies the normal expression of tissue-specific genes and proteins. Not surprisingly, the maintenance of normal phase relationships among tissues and organs appears to be adaptive. Disrupting the circadian network can produce severe pathology (Litinski et al., 2009; Karatsoreos et al., 2011). Optimizing the circadian timing system for treatment, such as appropriately timing drug administration, is a frontier research area (Levi & Schibler, 2007; and see below).
Hierarchical organization of the circadian timing system
Since the discovery of the SCN, and the consistent finding that most circadian rhythms are abolished following its destruction, it was generally assumed that the SCN was the only locus capable of independent circadian rhythm generation. In turn, all circadian rhythms throughout the brain and body were thought to be driven by downstream communication from the SCN. This notion was challenged following the observation that cultured fibroblasts exhibit circadian rhythms in gene expression following a serum shock (Balsalobre et al., 1998). With this experiment, it became clear that the ability to oscillate is a general property of tissues throughout the central nervous system and periphery (Damiola et al., 2000; Yamazaki et al., 2000; Yoo et al., 2004). The discovery that the SCN is not alone in the capacity to express endogenous oscillation was the beginning of a reconceptualization of the internal timekeeping system (Balsalobre et al., 1998).
It is now known that the circadian system is composed of multiple individual cellular oscillators located throughout the body and most of its organs and glands, and a role for intrinsic rhythmicity in tissues other than the SCN has been demonstrated. An example of SCN-independent timekeeping is seen in the olfactory bulb (Granados-Fuentes et al., 2006). SCN lesions eliminate circadian locomotor rhythms, but not odor-induced c-Fos rhythms in the olfactory bulb or piriform cortex. Olfactory bulb oscillators drive rhythms in spontaneous and odor-evoked activity within the bulb and also in its primary synaptic targets in the piriform cortex. In the sense that olfactory bulb oscillators express circadian rhythms in the absence of the SCN, persist in constant darkness and are required for rhythms in the piriform cortex, these oscillators can be considered master circadian pacemakers in the olfactory system. That said, in the intact animal, under unperturbed conditions, the SCN sets the phase of the olfactory bulb and other independent oscillators. The SCN modulates temporal activity in these cellular oscillators, such that each bears a regulated phase relationship to SCN pacemakers and hence to the others. Such findings led to the interpretation that the circadian clock mechanism modulates the activity of genes in a tissue-specific manner (Akhtar et al., 2002; Duffield et al., 2002; Miller et al., 2007; Silver & Lesauter, 2008; Hughes et al., 2009). This process can be accomplished either directly by CLOCK:BMAL1 activation through an E-box domain on their gene promoters (i.e. clock-controlled genes) or indirectly via downstream actions of clock-controlled gene products to optimize system-wide functioning on a daily schedule (Fig. 2).
For example, the gene encoding thrombomodulin, a cofactor for thrombin that is expressed on the surface of endothelial cells to reduce blood coagulation, contains an E-box domain in its promoter and is directly regulated by the CLOCK:BMAL1 complex (Takeda et al., 2007). The resulting rhythm in thrombomodulin probably contributes to daily changes in the likelihood of cardiovascular events. Generally, the risk of cardiovascular events peaks in the morning and evening; the morning peak is associated with the daily peak in rhythmic cortisol and epinephrine, whereas the night-time peak is associated with peak blood pressure and a trough in cardiac vagal modulation (Scheer et al., 2010). These broad implications expanded the audience of investigators and disciplines attending to the workings of the circadian timing system. Not only were the salient phenomena, such as sleep–wake cycles, of immediate interest, but also the invisible circadian oscillations, such as those seen in the workings of the heart or in the timing of cell division. Finally, the occurrence of sex differences in circadian rhythms (Bailey & Silver, 2013) and the demonstration of diseases associated with altered clock gene function rendered it necessary to consider the circadian timing system in a broad array of apparently unrelated disciplines, both applied and basic.
Unique features of the suprachiasmatic nucleus master clock
The finding of extra-SCN oscillators raised the question of how the brain's master clock relates to these other clocks. What unique aspect of the SCN lends it master clock function? Important discoveries included the finding that the cellular/molecular mechanism of the clock is similar in the SCN and in other tissues (Storch et al., 2002; Kamphuis et al., 2005; Liu et al., 2007; Miller et al., 2007). Thus, clock functioning in cells outside the SCN is equally vulnerable to disruption of the molecular clock (Liu et al., 2007). Although the intracellular core clock molecular mechanisms could not explain SCN master clock function, the unique pattern of its connections appeared to be responsible: the coupling of its neurons appeared to lend stability to the oscillation of SCN tissue, and its unique inputs and outputs appeared to be the basis of its capacity to function as a master clock (Liu et al., 2007; Welsh et al., 2010; Hastings et al., 2014). Unlike the SCN, rhythmic clock gene expression in other central and peripheral tissues dampens within a few days in culture, suggesting a loss of coupling among oscillators that results in an inability to detect population-wide rhythmicity (Balsalobre et al., 1998; Abe et al., 2002; Wilsbacher et al., 2002). Indeed, this notion was confirmed by monitoring single-cell bioluminescence of Per2::luciferase in cultured mouse fibroblasts and establishing that, despite the loss of population-wide rhythms in clock gene expression, single cells continued to show clear rhythms in Per2 expression (Welsh et al., 2004; Leise et al., 2012). These findings suggest that, in vivo, coherence among populations of subordinate oscillators is maintained through SCN communication. Also dramatic is the observation that, although SCN lesions abolish most circadian responses, some rhythms survive the ablation of the SCN.
For example, SCN-lesioned animals continue to show circadian rhythms when treated with methamphetamine (Honma & Honma, 2009), and they also continue to show food anticipatory behavior, a response based on circadian timing (Saper, 2006; Patton & Mistlberger, 2013). These latter findings suggest that the methamphetamine-entrainable oscillator and the food-entrainable oscillator might share network coupling properties with the SCN.
Although cellular oscillators are virtually ubiquitous, the SCN is unique not only in its ability to maintain rhythmic network-level stability, but also in its direct access to timing information. The SCN receives light information through a direct retino-hypothalamic tract, which synchronizes the master clock to environmental time (Morin & Allen, 2006). Historically, it was believed that the only photoreceptors present in the retina were rods and cones. This notion was questioned following the finding that mice lacking both rod and cone photoreceptors (retinally degenerate mice) exhibit normal photic entrainment despite being visually blind (Foster et al., 1993). By using a retrograde labeling strategy to identify retinal cells projecting to the SCN, it was discovered that a subset of retinal ganglion cells, expressing the photopigment melanopsin, are intrinsically photosensitive (Berson et al., 2002; Hannibal & Fahrenkrug, 2002; Hattar et al., 2002; Panda et al., 2002). As with the elimination of rod/cone signaling, elimination of melanopsin was not sufficient to abolish entrainment (Ruby et al., 2002; Lucas et al., 2003). Entrainment is only fully prevented in mice doubly mutant for both melanopsin and traditional rod/cone photoreceptors (Hattar et al., 2003; Panda et al., 2003). Underscoring the importance of connectivity, even though all photoreceptive classes can contribute to entrainment, they do so through the conduit of the intrinsically photosensitive retinal ganglion cells: ablating these cells alone (only 2% of all retinal ganglion cells) prevents entrainment (Schmidt et al., 2011). Together, these findings suggest that rod/cone photoreceptors project to intrinsically photosensitive retinal ganglion cells, which then send projections to the SCN to communicate this integrated light information. Because subordinate oscillators do not have access to light information, their phase relative to external time must be maintained through communication from the master clock in the SCN under light-entrained conditions.
Chapter 12 - Lateral Hypothalamic Control of Sleep in the Context of Cancer
Sleep plays a vital role in health and well-being. Its conservation along nearly all branches of the phylogenetic tree suggests that sleep serves an essential biological function. Disrupted or truncated sleep is consistently linked to heart disease, depression, obesity, and more recently, cancer. Indeed, chronic sleep disturbance affects 10%–20% of the population in the developed world, representing a substantial public health problem that translates into an estimated financial burden in the hundreds of billions of dollars annually. Poor sleep is a strong predictor of subsequent mortality in cancer patients even when taking into consideration other risk factors including age, hormone receptor status, cortisol concentrations, depression, and metastatic spread. Despite the prevalence of these problems, the underlying mechanisms mediating cancer-associated changes in sleep are unknown. Here, we discuss recent evidence supporting a cross-talk among tumors in the periphery, the nervous, endocrine, metabolic, and immune systems leading to sleep and systemic disruption. Special emphasis is given to the lateral hypothalamus, which contains many neural populations that couple sleep to metabolic state, immune status, and the environment. Among these, hypocretin/orexin neurons have been the most well characterized. We further lay down logical “next-steps” in this research area that are likely to drive the development of novel treatments for cancer-associated sleep disruption.
At the core of human thought, for the majority of individuals in the developed nations at least, there is the tacit assumption that as a species we are unfettered by the demands imposed by our biology and that we can do what we want, at whatever time we choose, whereas in reality every aspect of our physiology and behaviour is constrained by a 24 h beat arising from deep within our evolution. Our daily circadian rhythms and sleep/wake cycle allow us to function optimally in a dynamic world, adjusting our biology to the demands imposed by the day/night cycle. The themes developed in this review focus upon the growing realization that we ignore the circadian and sleep systems at our peril, and this paper considers: the mechanisms that generate and regulate circadian and sleep systems; what happens mechanistically when these systems collapse as a result of societal pressures and disease; how sleep disruption and stress are linked; why sleep disruption and mental illness invariably occur together; and how individuals and employers can attempt to mitigate some of the problems associated with working against our internal temporal biology. While some of the health costs of sleep disruption can be reduced, in the short-term at least, there will always be significant negative consequences associated with shift work and sleep loss. With this in mind, society needs to address this issue and decide when the consequences of sleep disruption are justified in the workplace.
Thus passes the day for the virtuous man. And when night comes, I take care not to summon sleep! He, the lord of the virtues, does not care to be summoned!
—Friedrich Nietzsche, Thus Spoke Zarathustra
1. Introduction and mammalian circadian rhythms
Almost all life on earth uses an internal biological clock to anticipate the profound changes that result from the Earth's rotation upon its axis. In organisms as varied as photosynthetic bacteria and humans, physiology and behaviour are ‘fine-tuned’ to the varied, yet predictable, demands of the day/night cycle. Creatures effectively ‘know’ the time of day, and these internally generated daily cycles are called ‘circadian rhythms’, which comes from the Latin circa (about) and dies (day). In addition to the alignment of the internal and external day, a circadian clock also ensures that biological processes occur in the appropriate temporal sequence. For cells to function properly they need the right materials in the right place at the right time. Thousands of genes have to be switched on and off in a specific order. Proteins, enzymes, fats, carbohydrates, hormones, nucleic acids and other compounds have to be absorbed, broken down, metabolized and produced in a precise time window. Energy has to be obtained, and then partitioned across the cellular economy and allocated to growth, reproduction, metabolism, locomotion and cellular repair. Without this internal temporal compartmentalization, our biology would be profoundly compromised.
Circadian rhythms must also be synchronized or entrained to the external environment using signals that provide time of day information (zeitgebers), and the patterns of light produced by the Earth's 24 h rotation provide the dominant entrainment cue. However, in many species, other environmental zeitgebers such as temperature, food availability, rainfall and even predation can contribute to entrainment. The key point is that circadian rhythms are not driven by an external cycle but are generated internally, and then synchronized to the external 24 h world.
Relating this to our own species, human physiology is organized around the daily cycle of activity and sleep. In the active phase, when energy expenditure is higher and food and water are consumed, organs need to be prepared for the intake, processing and uptake of nutrients. The activity of organs such as the stomach, liver, small intestine, pancreas and the blood supply to these organs need internal synchronization, which a clock can provide. During sleep, although energy expenditure and digestive processes decrease, many essential activities occur including cellular repair, toxin clearance, and memory consolidation and information processing by the brain. Disrupting this pattern, as happens with jet lag or shift work (see below), leads to internal desynchrony and the failure to do the right thing at the right time.
At the heart of the circadian system of mammals is a structure located deep within the brain's hypothalamus called the ‘suprachiasmatic nuclei’ or SCN (figure 1). The discovery of this structure has a fascinating history. Experiments in the 1950s and 1960s lesioned/destroyed small parts of the rat brain in the hunt for ‘the clock’, and narrowed it down to somewhere deep in the brain, probably the hypothalamus. Then, in a series of experiments in the early 1970s, and based upon the logic that circadian rhythms are entrained by the light/dark cycle, structures within the hypothalamus were identified that received a direct projection from the eye. The SCN receives a major projection from the retina, and when the SCN was lesioned, circadian rhythms were abolished [4,5]. Almost 20 years later the critical role of the SCN was confirmed by transplanting small neural grafts from the SCN region of a mutant hamster with a short circadian period of 20 h into non-mutant hamsters whose own SCN had been destroyed, abolishing their 24 h rhythms. The transplant not only restored circadian rhythms but, critically, the restored rhythms were 20 h, showing that an essential component of the clock, its period, had been transplanted with the SCN.
Figure 1. The mammalian suprachiasmatic nuclei (SCN) and retina. (i) The mouse brain from the side, showing the suprachiasmatic nuclei (SCN), which contain the master circadian pacemaker of mammals. The SCN receive a dedicated projection from the retina called the retino-hypothalamic tract (RHT). (ii) The frontal view of the brain shows the small, paired SCN, which are located either side of the third ventricle and sit on top of the optic chiasm (where the optic nerves combine). In mice, the SCN comprise approximately 20 000 neurons, and in humans 50 000 neurons. See text for details. (iii) Retinal rods and cones convey visual information to the retinal ganglion cells (RGCs) via the second-order neurons of the inner retina: the bipolar (B), horizontal (H) and amacrine (A) neurons. The optic nerve is formed from the axons of all the ganglion cells, and this large nerve takes light information to the brain. A subset of photosensitive retinal ganglion cells (pRGC, shown in dark grey) can also detect light directly. The pRGCs use the blue light-sensitive photopigment melanopsin, or OPN4. Thus photodetection in the retina occurs in three types of cell: the rods, cones and pRGCs. The pRGCs also receive signals from the rods and cones which, although not required, can help drive light responses by the pRGCs.
The SCN of humans comprises about 50 000 cellular circadian oscillators, sufficiently stable to generate circadian rhythms of neuronal firing for at least six weeks in vitro. This was first shown in dispersed SCN neurons from neonatal rats, placed into culture on a grid of microelectrodes. Individual neurons displayed robust circadian rhythms in electrical firing, but the phases of these individual rhythms were all different, showing that SCN neurons act as individual clocks and that the basic oscillation lay within individual cells, and was not the emergent property of a network of individual neurons.
The subcellular molecular generation of a circadian oscillation arises from a complex interaction between key clock genes and their protein products. At its core, the molecular clockwork comprises a transcriptional/translational feedback loop (TTFL), whereby genes and their protein products interact and feed back to inhibit their own transcription, generating a 24 h cycle of protein production and degradation. The key elements of this molecular clockwork are illustrated in figure 2. For further details, see .
Figure 2. The mammalian molecular clock. The driving force of the mammalian molecular clockwork is a transcriptional/translational feedback loop (TTFL). The transcriptional drive is provided by two proteins named ‘Circadian Locomotor Output Cycles Kaput’, or less tortuously CLOCK (CLK), which links with ‘Brain and muscle ARNT-like 1’ or BMAL1. The CLK–BMAL1 complex binds to E-box promoters, driving transcription of five core clock genes: three Period genes (Per), giving rise to the proteins PER1, PER2 and PER3, and two Cryptochrome genes (Cry), which encode the CRY1 and CRY2 proteins. The PER proteins combine with the kinase CK1 (casein kinase 1) and are phosphorylated. The PER–CK1 complex then binds to the CRYs to form a CRY–PER–CK1 complex. Within the CRY–PER–CK1 complex, CRY and PER are phosphorylated by other kinases, which then allows the complex to move into the nucleus and inhibit CLK–BMAL1-driven transcription of the Per and Cry genes, forming the negative limb of the TTFL. CRY–PER–CK1 protein complex levels rise throughout the day, peak at dusk, and are then degraded, declining to their lowest level the following dawn. The net result is a TTFL, whereby the Per and Cry genes and their protein products interact and feed back to inhibit their own transcription, generating a 24 h cycle of protein production and degradation. Note that multiple other genes and their proteins generate additional feedback loops that provide further stability to the circadian oscillation. Significantly, polymorphisms in several of these clock genes have been associated with human ‘morning types’ (larks) and ‘evening types’ (owls).
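The delayed negative feedback described in this caption, in which transcription is repressed by the downstream product of its own transcript, can be caricatured with a classic Goodwin-style oscillator. The sketch below is purely illustrative: the three variables stand loosely for mRNA, cytoplasmic protein and nuclear repressor, and all rate constants and the Hill coefficient are invented for demonstration rather than fitted to real clock-gene data.

```python
# Minimal Goodwin-style negative-feedback oscillator: a caricature of the
# TTFL (transcription repressed by its own downstream protein product).
# All parameter values are illustrative assumptions, not measured rates.

def simulate(t_end=500.0, dt=0.01):
    m, p, r = 0.1, 0.1, 0.1   # "mRNA", "cytoplasmic protein", "nuclear repressor"
    n = 10                    # Hill coefficient: strongly cooperative repression
    traj = []
    t = 0.0
    while t < t_end:
        dm = 1.0 / (1.0 + r**n) - 0.1 * m   # transcription, repressed by r
        dp = m - 0.1 * p                    # translation and protein turnover
        dr = p - 0.1 * r                    # nuclear entry of the repressor
        m += dm * dt
        p += dp * dt
        r += dr * dt
        traj.append((t, r))
        t += dt
    return traj
```

With these illustrative rates the repressor level rises and falls with a fixed period, mimicking the daily accumulation and degradation of the PER/CRY complex; a realistic clock model would add separate Per/Cry species, phosphorylation delays and regulated degradation, as the caption describes.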
The SCN projects directly to approximately 35 brain regions, mostly located within the hypothalamus, and particularly those regions of the hypothalamus that regulate hormone release. Indeed, many hormones under pituitary control, like cortisol, are under tight circadian regulation [11,12]. Furthermore, the SCN regulates the activity of the autonomic nervous system, which acts to time-stamp many aspects of physiology, including the sensitivity of target tissues to hormonal signals. In addition to these direct neuronal connections, the SCN communicates with the rest of the body using diffusible chemical signals. This was first shown by transplanting SCN tissue contained within tiny semi-permeable capsules into SCN-lesioned animals. The capsule prevented neural connections from being re-established but allowed chemical signals from the transplanted SCN to diffuse out. Even without a neural connection, some circadian rhythms were restored. In more recent years the identity of these chemical signals has begun to emerge.
Although the SCN is the ‘master clock’ in mammals, it is not the only clock. There are cellular clocks, using essentially the same subcellular mechanisms (figure 2), within the liver, muscles, pancreas, adipose tissue and probably every organ and tissue of the body. Destruction of the SCN abolishes multiple rhythms, such as locomotor activity, and this was the reason the SCN was considered to ‘drive’ 24 h rhythmicity. However, it is now appreciated that the loss of overt rhythmicity occurs because (i) some of the individual peripheral clock cells dampen and lose rhythmicity after several cycles but, more commonly, because (ii) the individual cellular clocks become uncoupled from each other. The cells continue to tick, but at different phases, so that an overt 24 h rhythm within the tissue or organ is lost. This discovery led to the appreciation that the SCN acts as a pacemaker to coordinate, but not drive, the circadian activity of billions of individual peripheral circadian oscillators throughout the tissues and organs of the body. The signalling pathways used by the SCN to entrain these peripheral clocks are still uncertain, but we know that the SCN does not send out countless separate signals around the body targeted at specific individual clocks. Rather, there seems to be a limited number of neuronal and humoral signals. The SCN also receives feedback signals from the periphery that allow the whole body to function in synchrony with the varying demands of the 24 h light/dark cycle. The result is a complex circadian network that coordinates rhythmic physiology and behaviour.
2. Shedding light on the clock
Eye loss in all groups of mammals abolishes the capacity to entrain circadian rhythms to the light/dark cycle. However, astonishingly, the visual cells of the retina, the rods and cones (figure 1), are not required for the detection of the dawn/dusk signal. There exists a third class of photoreceptor within the eye [21,22]. Our studies in the 1990s showed that mice lacking all rod and cone photoreceptors (rdta/cl and rd/rd cl) could still regulate their circadian rhythms to light perfectly normally, but when the eyes were removed the ability to entrain was lost [21,22]. These experiments showed that there had to be another photoreceptor within the eye. The rodless/coneless mouse models provided a powerful approach to characterizing this third photoreceptor, and along with studies in the rat and monkey, the retina was shown to contain a small population (around 1–2%) of photosensitive retinal ganglion cells (pRGCs) that use a blue light-sensitive photopigment called ‘melanopsin’ or OPN4. The OPN4 gene was originally isolated from the light-sensitive pigment cells or ‘melanophores’ found in the skin of amphibians, including frogs and toads. The name ‘melanopsin’ has stuck, and is often confused with ‘melatonin’, but the two molecules are entirely unrelated. Genetic ablation of the rods, cones and melanopsin-pRGCs eliminates circadian responses to light, demonstrating that there are no additional photoreceptors that contribute to circadian entrainment, either in the eye or elsewhere. However, although the rods and cones are not required for circadian entrainment, they are now known to contribute to the light responses of the melanopsin pRGCs under certain circumstances. Genetic silencing of melanopsin in the pRGCs does not block photoentrainment: mice can still entrain, but with reduced sensitivity [27–30].
Rods and cones send indirect projections to the pRGCs, and it seems that in the absence of endogenous OPN4, the rods and cones can partially compensate for the loss of OPN4. A complex pattern is emerging of how the different photoreceptor populations interact to bring about entrainment [31–33].
In collaboration with colleagues at Harvard University, we studied humans who had lost all of their rods and cones as a result of genetic disease. Just like the rodless/coneless mice, human circadian entrainment was found to be intact, mediated by pRGCs using the photopigment melanopsin, maximally sensitive to blue light around 480 nm. This finding is having a significant impact in the clinic [35,36]. For example, genetic diseases that result in the loss of the rods and cones and cause visual blindness often spare the pRGCs. Under these circumstances, individuals who have their eyes but are visually blind, yet possess pRGCs, should be advised to expose their eyes to sufficient morning and evening light to entrain their circadian system. The realization that the eye provides us with both our sense of space and our sense of time, via entrainment of the SCN, is changing the definition and treatment of human blindness.
3. Biology of sleep
The regular cycle of sleep and wakefulness is the most obvious 24 h pattern in our behaviour, and the sleep/wake cycle involves a highly complex set of interactions involving multiple neural circuits, neurotransmitters and hormones, none of which are exclusive to the generation of sleep [37,38]. The major brain structures and neurotransmitter systems involved in the sleep/wake cycle are summarized in figure 3.
Figure 3. Sleep/wake states arise from mutually excitatory and inhibitory circuits that result in two distinct behavioural states of wake (consciousness) and sleep. The diagram illustrated here represents a greatly simplified version of the interactions associated with the wake/sleep switch. During wake, orexin (also known as hypocretin) neurons in the lateral hypothalamus project to and excite (+) different populations of wake-promoting neurons within the hind- and mid-brain, including: monoaminergic neurons, which release histamine, dopamine, noradrenaline and serotonin; cholinergic neurons in the hind-brain, which release acetylcholine; and an important group of broadly distributed neurons that release glutamate. These neurotransmitters drive wakefulness and consciousness within the cortex. In addition, acute activation of the stress axis (figure 5) will also contribute to sleep/wake regulation, acting to promote wake and inhibit sleep. During wake, the monoaminergic neurons project to (dotted line) and inhibit (−) the ventrolateral preoptic nuclei (VLPO). During sleep, circadian and homeostatic sleep drivers (figure 4) activate the VLPO, which releases the neurotransmitters gamma-aminobutyric acid (GABA) and galanin to directly inhibit (−) the orexin neurons in the lateral hypothalamus and the monoaminergic, cholinergic and glutamatergic neuronal populations. Further, a subpopulation of cortical interneurons projects long distances across the cerebral cortex and releases the inhibitory neurotransmitter GABA during sleep. These neurons are activated during sleep in a manner proportional to the homeostatic drive for sleep (figure 4). The primary measure used to define sleep in mammals is the electroencephalogram (EEG), which characterizes sleep as either rapid eye movement (REM) or non-rapid eye movement (NREM) states. The NREM–REM switch occurs approximately every 60–90 min and is driven by a network of neurons within the mid- and hind-brain.
During REM sleep, monoaminergic neurons remain inhibited, but cholinergic neurons are activated (+). REM-on neurons project to the spinal cord and drive muscle paralysis (atonia). If the atonia pathway fails to activate, a condition termed REM sleep behaviour disorder (RBD) can arise. Further, the degree of loss of atonia can predict the development of Parkinson's disease. It is worth emphasizing that we have only a rudimentary understanding of the real function of REM versus NREM sleep.
The complex interactions associated with sleep/wake generation are regulated, under normal circumstances, by two endogenous drivers: a homeostatic process (Process S), which increases as a function of wakefulness, and a circadian process (Process C). This has been termed the ‘two-process’ model of sleep, which broadly explains how the sleep/wake cycle is aligned to the night/day cycle (figure 4).
Figure 4. A depiction of the two-process model of sleep regulation. A 24 h circadian timer (Process C) and a homeostatic driver (Process S, dotted line) interact to determine the timing, duration and structure of sleep. A circadian-driven rhythm of sleep promotion during the night and wake during the day is opposed by a homeostatic driver, which increasingly promotes sleep (S) during the day; then, during sleep, homeostatic sleep pressure is dissipated towards the end of the sleep episode. The time of day most suitable for sleep, the ‘sleep window’, occurs as a result of the combined effects of the circadian and homeostatic drivers. Sleep pressure within the sleep window will be highest during the first part of the night but increasingly reduced as the homeostatic drive for sleep dissipates towards the end of the night. During sleep most humans experience 4–5 cycles of NREM/REM sleep and, without the influence of an alarm clock, we wake naturally from REM sleep.
In humans, several agents have been implicated in driving sleep homeostasis, and adenosine has emerged as a strong candidate. Adenosine increases in the brain during wake and after enforced sleep deprivation. Furthermore, perfusion of adenosine into the brain of freely moving rodents reduces wakefulness and activates neurons associated with sleep promotion. Caffeine is a potent stimulant and alerting agent, and seems to function by blocking adenosine receptors. Other sleep-promoting factors include prostaglandin. The circadian process seems to drive both wake- and sleep-promoting behaviour, promoting the maintenance of wakefulness during the day, opposing the increasing homeostatic drive for sleep, while at night Process C promotes sleep (figure 4). The model depicted in figure 4 has been very powerful in understanding the basic interactions between the circadian system and the homeostatic drivers regulating sleep, but in reality the regulation of sleep is likely to be much more complicated. Not least, sleep in humans and other animals is often not a single consolidated block but can be ‘biphasic’ or even ‘polyphasic’, with two or more periods of sleep separated by short periods of wake [45,46]. How such fragmentary sleep is generated is uncertain and will require additional inputs to the model depicted in figure 4.
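The interaction of Process S and Process C described above can be sketched numerically: sleep pressure builds exponentially during wake and dissipates during sleep, while sinusoidal circadian thresholds gate when sleep starts and ends. This is a hedged toy implementation; the time constants are in the spirit of commonly quoted two-process values, but the threshold positions and amplitudes are invented for illustration and are not fitted to human data.

```python
import math

# Toy two-process model: homeostatic sleep pressure S gated by circadian
# thresholds (Process C). Threshold placement and amplitudes are
# illustrative assumptions, not fitted parameters.

def two_process(days=10, dt=0.01):
    tau_rise, tau_decay = 18.2, 4.2   # h; commonly quoted build/decay constants
    S, asleep = 0.5, False
    episodes = []                     # (onset_h, duration_h) of sleep bouts
    onset = None
    t = 0.0
    while t < days * 24:
        c = math.cos(2 * math.pi * t / 24.0)   # circadian process C
        upper = 0.75 + 0.10 * c                # sleep-onset threshold
        lower = 0.25 + 0.10 * c                # wake-up threshold
        if asleep:
            S *= math.exp(-dt / tau_decay)     # pressure dissipates in sleep
            if S < lower:
                asleep = False
                episodes.append((onset, t - onset))
        else:
            S += (1.0 - S) * dt / tau_rise     # pressure builds during wake
            if S > upper:
                asleep = True
                onset = t
        t += dt
    return episodes
```

Run over ten simulated days, the model settles into one consolidated sleep bout per 24 h cycle, placed where rising pressure meets the dipping circadian threshold; reproducing biphasic or polyphasic sleep would indeed require extra inputs, as the text notes.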
Melatonin is often termed the ‘sleep hormone’, but this label is misleading. Melatonin is synthesized mainly in the pineal gland, although the retina and other regions of the body can also produce small amounts. The pineal is regulated by the SCN to produce a circadian pattern of melatonin release, with levels rising at dusk, peaking in the blood around 02.00–03.00 and then declining before dawn. Light, detected by the pRGCs, also acts to inhibit melatonin production acutely. As a result, melatonin acts as a biological marker of the dark. In relation to sleep, melatonin receptors located on SCN neurons are thought to detect nocturnal melatonin, providing an additional zeitgeber for clock entrainment that reinforces light entrainment signals from the eye [48,49]. However, although some studies suggest that taking melatonin may shorten sleep latency (the time taken to fall asleep) and increase total sleep time, the effects of melatonin, and of melatonin agonists, on sleep are modest. While melatonin production occurs at night during sleep in diurnal animals such as humans, nocturnal animals like mice and rats also produce melatonin at night, when they are active. Certainly, sleep propensity in humans is closely correlated with the melatonin profile, but this may be correlation and not causation. Indeed, individuals who do not produce melatonin (e.g. tetraplegic individuals, people on beta-blockers or pinealectomized patients) still exhibit circadian sleep/wake rhythms with only very minor changes in sleep.
4. Sleep loss and harmful stress
Harmful stress, defined here as ‘a physical, mental, or emotional stimulus that results in impaired health or performance’, can arise from disrupted or shortened sleep, and is a common feature across many sectors of society, from teenagers, through the business and public sectors (such as night shift workers), to the elderly. Inadequate sleep usually means a sleep duration shorter than 7–8 h in every 24 h. However, there is considerable individual variation, and self-assessments of sleep-need are very important. The main symptom of sleep loss is excessive daytime sleepiness, but a combination of the criteria in table 1 is additionally helpful for self-assessment. For additional background see .
Table 1. Self-assessments of sleep-need. There is considerable individual variation in sleep duration and timing. As a result, it is important for individuals to define their own sleep needs using some or all of the criteria listed here. Once sleep-need is established, then sleep timing and duration should be defended by altered behaviours.
The sleep loss experienced by night shift workers can be profound. Shift workers try to sleep during the day and invariably experience shorter (less than 5–6 h in every 24 h) and more disrupted sleep. In effect, shift workers are at work when their biology is in the sleep state, and then try to sleep when their biology is prepared for wake. Irrespective of the years spent on a permanent night shift, nearly all (approx. 97%) night shift workers do not adjust to the nocturnal regime but remain synchronized to daytime. This is directly related to light exposure. Artificial light in the office or factory is dim compared to environmental light. Shortly after dawn, natural light is some 50–100 times brighter than the 300–400 lux experienced in the workplace, and by noon natural light is 500–1000 times brighter. After leaving the night shift, an individual will usually experience bright natural light during the day, and the circadian system will always lock onto the brighter light signal as daytime and align internal biology to the diurnal state. In one study, night shift workers were exposed to 2000 lux in the workplace and then completely shielded from natural light during the day. Under these circumstances, they became nocturnal. However, this is not a practical solution for most night shift workers.
Many individuals, and night shift workers in particular, experience chronic sleep deprivation and circadian rhythm disruption, and there is increasing evidence that such problems act together to alter the release of corticosteroids (cortisol) regulated by the hypothalamic–pituitary–adrenal (HPA) axis. Cortisol release starts with the stimulation of the pituitary gland to release adrenocorticotropin (ACTH) into the blood. ACTH reaches the adrenal gland and stimulates the adrenal cortex to release glucocorticoids (corticosteroids). ACTH release is under circadian control, resulting in high levels of cortisol being secreted just prior to and during the active part of the day, with lower levels of release towards evening and sleep. In addition to this 24 h variation, there is an ultradian rhythm of ACTH release that drives pulses of cortisol secretion from the adrenal cortex. Under normal circumstances, the circadian and pulsatile release of cortisol helps regulate and ‘fine-tune’ metabolic and immune responses to the varied demands of activity and sleep. Under conditions of sleep disruption (and other stressors), the HPA axis is activated acutely, resulting in elevated levels of ACTH which then drive high levels of cortisol.
In addition to the elevation of cortisol, sleep deprivation activates the sympatho-adreno-medullary (SAM) drive, which, via the sympathetic nervous system, stimulates the release of catecholamines (primarily epinephrine/adrenaline) from the adrenal medulla. Chronically elevated cortisol and adrenaline together drive a widespread stress response that, if sustained, will mobilize and release glucose into the bloodstream while reducing insulin release; increase heart rate and blood pressure; suppress the immune response; slow digestion; limit tissue repair; and reduce memory consolidation and cognitive function. If sustained, such physiological changes promote poor health and an increased difficulty coping with life (figure 5). The abnormal HPA secretory activity seen in shift workers and chronic insomniacs can be replicated in laboratory studies. For example, in one study, healthy young males were only allowed to sleep 4 h over six consecutive nights. This resulted in increased levels of cortisol in the afternoon and early evening, and the rate of decrease of free cortisol in saliva was approximately six times slower in sleep-restricted individuals compared with rested controls [63,64]. Furthermore, chronic short sleepers have higher levels of cortisol than normal sleepers.
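The consequence of a sixfold slower rate of decrease can be sketched with a simple first-order decay model. This is an illustration, not the analysis used in the cited study: the rested half-life value below is an assumption chosen only to make the comparison concrete, and the single grounded input is the ~6x slower clearance reported after sleep restriction.

```python
# Sketch: model the afternoon fall of free salivary cortisol as first-order
# exponential decay, comparing rested clearance with clearance six times
# slower (as reported after 4 h sleep for six nights). The half-life of
# 1.5 h is a hypothetical value used purely for illustration.

def fraction_remaining(t_hours, half_life_hours):
    """Fraction of the initial cortisol level left after t_hours."""
    return 0.5 ** (t_hours / half_life_hours)

rested_half_life = 1.5                        # assumed, for illustration
restricted_half_life = rested_half_life * 6   # ~6x slower decrease

t = 3.0  # hours after the afternoon peak
print(f"rested:     {fraction_remaining(t, rested_half_life):.2f}")
print(f"restricted: {fraction_remaining(t, restricted_half_life):.2f}")
```

Under these assumptions, a rested individual clears 75% of peak cortisol within three hours, while a sleep-restricted individual still carries around 80% of it into the evening, which is consistent with the elevated afternoon and early-evening levels described above.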
Figure 5. The impact of chronic sleep disruption and reduced sleep on the promotion and interaction of physiological stress, via the hypothalamic–pituitary–adrenal (HPA) and sympatho-adreno-medullary (SAM) axes, and psychosocial stress, whereby sleep loss and fatigue result in an imbalance between the demands placed upon an individual and an inability of the individual to manage these demands. Ultimately, the combined and interlocking effects of physiological and psychosocial stress lead to emotional, cognitive and physiological pathologies (table 2).
Table 2. The impact of chronic sleep and circadian rhythm disruption (SCRD) upon human emotional responses, cognition, physiology and health. Such associations have long been a concern for shift workers, who suffer from extreme forms of SCRD. Citations: fluctuations in mood [72–75], depression and psychosis [76–79], anxiety, irritability, loss of empathy, frustration [80–82], risk-taking and impulsivity [83–86], negative salience, stimulant, sedative and alcohol abuse [88–92], illegal drug use, impaired cognitive performance and the ability to multi-task [94–96], memory, attention and concentration [97–100], communication and decision-making [90,101–104], creativity and productivity [105–108], motor performance [96,109], dissociation/detachment [110,111], daytime sleepiness, micro-sleeps, unintended sleep [112–115], altered stress response [116,117], altered sensory thresholds [118–120], impaired immunity and infection [121,122], cancer [123–125], metabolic abnormalities and diabetes II [63,126–129], cardiovascular disease [129–131].
In addition to the activation of the HPA and SAM arms of the stress response, chronic sleep disruption gives rise to sleepiness and fatigue that can precipitate psychosocial stress. Under these circumstances, an individual will experience an imbalance between the demands placed upon them and their perceived inability to manage these demands. This impaired ability to cope with the demands of life acts as an additional stressor to augment the activation of the HPA and SAM stress responses, and can lead directly to behavioural changes including frustration and low self-esteem, increased worry, anxiety and depression. Such behaviours promote further sleep loss and fatigue. The relationships and consequences associated with sleep disruption, the chronic release of cortisol and adrenaline, and psychosocial stress are summarized in figure 5.
5. The varied impacts and consequences of sleep disruption
The complexity of sleep generation and regulation (figures 3 and 4) renders this behavioural state very vulnerable to sleep and circadian rhythm disruption (SCRD) from multiple causes. For clarification, the term ‘SCRD’ is used as an ‘umbrella term’ in this review to refer to any form of sleep or circadian disruption, and does not distinguish between cause and effect. Thus, insufficient sleep arising from lack of opportunity and sleep loss due to disease would both be examples of SCRD. The term SCRD encompasses problems with the quality, timing and amount of sleep, and includes the 83 types of disorder included in the International Classification of Sleep Disorders (ICSD), 3rd edition. The ICSD divides sleep disorders into seven main categories: (i) insomnia (difficulty falling asleep or staying asleep); (ii) sleep-related breathing disorders (e.g. obstructive sleep apnoea); (iii) central disorders of hypersomnolence (e.g. narcolepsy); (iv) circadian rhythm sleep–wake disorders (illustrated in figure 6); (v) parasomnias (e.g. sleep walking and night terrors); (vi) sleep-related movement disorders (e.g. restless legs syndrome); and (vii) other sleep disorders, which do not fulfil the criteria of the other six classifications. Some of the consequences of these seven categories are illustrated in figure 6.
Figure 6. Illustration of altered sleep patterns arising from multiple causes. Filled horizontal bars represent periods of sleep on consecutive work days and at the weekend. Advanced sleep phase disorder (ASPD) is characterized by difficulty staying awake in the evening and difficulty staying asleep in the early morning. Typically, individuals go to bed and rise about 3 or more hours earlier than the societal norm. Delayed sleep phase disorder (DSPD) is characterized by a 3 h delay or more in sleep onset and offset. This often leads to greatly reduced sleep duration during the working week and extended sleep on free days. ASPD and DSPD can be considered as pathological extremes of morning (lark) or evening (owl) preferences. It is important to stress that ASPD and DSPD are not merely shifted sleep/wake patterns, but conditions that cause distress or impairment because they conflict with the schedules demanded by societal pressures or personal preferences. Free-running or non-24 h sleep/wake disorder describes a condition where an individual's sleep occurs later and later each day. This has been observed in individuals with complete eye loss or other conditions such as schizophrenia. Irregular or completely fragmented sleep is typically observed in individuals who lack a circadian clock. Note: ASPD, DSPD, free-running and irregular sleep/wake patterns are most often, but not exclusively, linked to circadian rhythm abnormalities. Insomnia can describe either a symptom or a disorder; as a disorder, it is a condition that leads to difficulty falling asleep or staying asleep, even when a person has the chance to do so. Insomnia is frequently associated with reduced sleep (hyposomnia) and can arise from multiple causes.
As illustrated in figure 5, a major driver of SCRD is chronic stress resulting from physiological and/or psychosocial factors. However, there are multiple additional and interlinked drivers of SCRD. These relationships are illustrated in figure 7. In brief, multiple illnesses, and illness resulting in pain or other discomfort, are a major driver of SCRD. In addition, the varied problems that arise from the 24/7 society (extended working hours, reduced sleep, shift work, jet lag—essentially working against the biological drivers of sleep) also predispose individuals to SCRD. As illustrated in figure 7, the net result is circadian rhythm disruption, insomnia, sleep deprivation and disturbed social behaviours. These initial impacts can then lead to fatigue (a feeling of lack of energy and motivation that can be physical, mental or both), daytime sleepiness (persistent and overwhelming sleepiness during the day) and psychosocial anxiety (an imbalance between the demands placed upon an individual and their perceived failure to manage these demands) [70,71]. Fatigue, daytime sleepiness and psychosocial disruption can precipitate a global disruption in physiology, important behavioural changes and the chronic activation of the physiological stress axis. The short- and long-term consequences for emotional, cognitive and physiological health are summarized in table 2. It is important to note that many of these interactions are bi-directional, resulting in a matrix of positive feedback loops that can reinforce each other and precipitate a major breakdown in health and overall wellbeing.
Figure 7. The drivers of emotional, cognitive and physiological poor health. Factors such as illness, illness resulting in pain, stressful situations and/or the impact of shift work and the 24/7 society can all lead to disrupted circadian rhythms, insomnia, sleep deprivation and abnormal patterns of social behaviour. Collectively, these problems can give rise to fatigue, daytime sleepiness and psychosocial anxiety arising from altered patterns of social interaction. These altered behaviours will, in turn, disrupt physiology (e.g. metabolic abnormalities), drive abnormal patterns of behaviour (e.g. promoting the use of stimulants and sedatives) and stimulate physiological stress (chronic release of cortisol and adrenaline). Collectively, this cascade of events underpins short- and long-term somatic and mental illness (table 2). Furthermore, it is important to appreciate that many of these interactions are bi-directional, acting to reinforce each other via multiple positive feedback loops.
As summarized in table 2, chronic SCRD, of the sort experienced by shift workers or other similarly affected groups, can lead to an increased risk of serious health conditions. For example, nurses are one of the best-studied groups of night shift workers, and many years of shift work have been associated with a broad range of health problems including type II diabetes, gastrointestinal disorders and even breast and colorectal cancers. Cancer risk increases with the number of years of shift work, the frequency of rotating work schedules, and the number of hours per week working at night. The correlations are so strong that shift work is now officially classified as ‘probably carcinogenic [Group 2A]’ by the World Health Organization. Other studies of shift workers show increased rates of heart disease, stroke, obesity and depression (table 2). A study of over 3000 people in southern France found that those who had worked some type of extended night shift for 10 or more years had much lower overall cognitive and memory scores than those who had never worked the night shift. Similar findings have been reported in long-haul airline pilots and aircrew [134,135].
SCRD also impairs glucose regulation and metabolism. Under laboratory conditions, sleep restriction in healthy young men led to signs of insulin resistance, which can ultimately lead to type II diabetes. Two hormones, leptin and ghrelin, seem to play a key role in this process. Leptin is produced by fat cells and is a signal of satiety; ghrelin is produced by the stomach and signals hunger, particularly for sugars. Together, these hormones regulate hunger and appetite. Restricting the sleep time of healthy young men under laboratory conditions for 7 days caused their leptin levels to fall (approx. 17%) and their ghrelin levels to rise (approx. 28%), and increased their appetite, especially for fatty and sugary foods (increased by 35–40%). Such an SCRD-induced distortion of appetite may explain why shift workers have a higher risk of weight gain, obesity and type II diabetes. Significantly, night shift workers have elevated levels of the stress hormone cortisol, which has also been shown to suppress the action of insulin and raise blood glucose. At another level, there is also a striking association between SCRD and smoking. For example, independent of social background and region, the number of smokers in the population increases with higher levels of SCRD. Further, the consumption of alcohol and caffeine increases with SCRD. Finally, and based upon scores from the Beck Depression Inventory, the tendency towards depression increases when work times are not compatible with circadian sleep times, which leads to the next topic.
6. The special case of sleep disruption in mental illness
SCRD is a common co-morbidity in numerous psychiatric disorders. Most studies have focused upon mood disorders, especially unipolar depression and seasonal affective disorder, yet SCRD is also prominent in the more severe, psychotic disorders such as schizophrenia [140,141]. Interestingly, such links between schizophrenia and abnormal sleep pre-date observations in mood disorders, and were first described in the late nineteenth century by the German psychiatrist Emil Kraepelin. Today, clinical levels of insomnia are reported in more than 80% of patients with schizophrenia, and SCRD is increasingly recognized as one of the most common features of the disorder. SCRD in schizophrenia is very variable, and the abnormal sleep/wake patterns illustrated in figure 6 have all been noted [142–147]. Importantly, schizophrenia patients with SCRD score badly on many quality-of-life clinical subscales, highlighting the human cost of SCRD [143,148,149], and, significantly, schizophrenia patients often comment that an improvement in sleep is one of their highest priorities during treatment. It is also becoming clear that SCRD impacts upon the onset, outcome and relapse of mental illness [151–153]. These findings suggest that there are causal relationships between SCRD and psychoses, perhaps mediated via common (or overlapping) mechanisms.
The association between mental illness in general and SCRD has, until recently, been considered to arise from exogenous factors including social isolation, antipsychotic medication and/or activation of the stress axis. Such a linear explanation of the relationship between psychosis and SCRD now appears overly simplistic. For example, our studies have addressed this association by examining SCRD in patients with schizophrenia and comparing these individuals with unemployed control subjects. The results demonstrated that severe SCRD exists in schizophrenia and persists independently of antipsychotic medication. Further, sleep disruption cannot be explained on the basis of lack of employment, as unemployed individuals show remarkably stable sleep/wake patterns. These results are consistent with an alternative hypothesis, which suggests that psychoses and SCRD may share common and overlapping mechanistic pathways. As discussed above, the sleep and circadian timing system is the product of a complex interaction between multiple genes, brain regions, neurotransmitters and modulatory hormones (figures 2 and 3). As a consequence, abnormalities in any of the underlying neurotransmitter systems that predispose individuals to mental illness would almost certainly impinge upon the sleep/circadian timing systems at some level. Similarly, psychosis involves several distributed brain circuits, affecting a range of neurotransmitter systems, many of which overlap with those underlying sleep and circadian rhythm generation. Viewed in this context, it is no surprise that SCRD is common in psychoses, or that SCRD will, in turn, have widespread effects, ranging across many aspects of neural and neuroendocrine function as outlined in table 2. Significantly, many of the pathologies caused by SCRD (table 2) are routinely reported as co-morbid with neuropsychiatric illness but are rarely linked to the disruption of sleep.
Furthermore, the consequences of SCRD result in abnormal (reduced) light exposure and atypical patterns of social behaviour (figure 7), closing a vicious cycle to further destabilize sleep/circadian physiology [155,156]. The common and overlapping mechanisms of psychosis and SCRD are illustrated in figure 8. Critically, these relationships explain how relatively small changes in either the impact of SCRD or mental illness will be amplified by physiological feedbacks to increase an individual's vulnerability to neuropsychiatric illness and co-morbid health problems.
Figure 8. Diagram illustrating the possible relationship between mental illness and sleep and circadian rhythm disruption (SCRD). The diagram illustrates the hypothesis that mental illness and SCRD share common and overlapping pathways within the brain. As a result, an altered pattern of neurotransmitter release (shown as Δ) that predisposes an individual to mental illness will result in a parallel impact upon the sleep/circadian systems. Disruption of sleep (shown as α) will, likewise, impact upon multiple aspects of brain function, with both short- and long-term consequences for emotional, cognitive and physiological health (figures 5 and 7 and table 2), and in the young may even have developmental consequences. The consequences of mental illness (shown as β), giving rise to psychosocial (e.g. social isolation) and physiological stress (figures 5 and 7), along with the impact of medication, will impinge upon the sleep and circadian systems. A positive feedback loop could rapidly be established whereby a small change in neurotransmitter release is amplified into more pronounced SCRD and poorer mental health.
The conceptual framework outlined in figure 8 allows four explicit predictions, all of which are strongly supported by recent findings. Specifically: (i) genes linked to mental illness will play a role in sleep and circadian rhythm generation and regulation [154,157]; (ii) genes that generate and regulate sleep and circadian rhythms will play a role in mental health and illness; (iii) SCRD will precede mental illness under some circumstances; and (iv) amelioration of SCRD will have a positive impact upon mental illness. For this last prediction, it is worth mentioning one recent publication. The aim of this study was to determine whether treating insomnia would reduce levels of paranoia and hallucinations in university students with insomnia. The study was a randomized controlled trial at 26 UK universities, in which students with insomnia were randomly assigned to receive either digital cognitive behavioural therapy for insomnia (CBTi) (n = 1891) or no intervention (n = 1864). The primary outcome measures were insomnia, paranoia and hallucinatory experiences. The results showed that a reduction in insomnia, achieved using CBTi, was correlated with a highly significant reduction in paranoia and hallucinations over the study period. The study concluded that insomnia is a causal factor in the occurrence of psychotic experiences and other mental health problems. These findings are highly significant as they show that treatments for SCRD represent a potentially new and powerful therapeutic target for the reduction of symptoms in mental illness (figure 8).
7. Potential actions as an individual and as an employer
Based upon the findings summarized above, we can legitimately ask the question: ‘what evidence-based approaches can be used to mitigate the causes and consequences of SCRD, both as individuals and as employers?’ Some possible actions are summarized in tables 3 and 4, and discussed below.
Table 3. Individual actions to achieve better sleep. Actions that can be undertaken during the day, before going to bed, in the bedroom (sleeping space) and in bed to help improve sleep. Such approaches, either alone or in parallel with clinically directed cognitive behavioural therapy for insomnia (CBTi), can lead to a marked improvement in sleep.
Table 4. SCRD-induced employee issues and potential employer responses. On the basis of existing information, employers should establish ‘best practice’ approaches to mitigate some of the inevitable consequences of SCRD in the workplace (table 2). However, it must be emphasized that, as a society, we have to acknowledge that shift work and patterns of employment that disrupt sleep are detrimental to health, and that currently there is no way of eliminating completely the health problems listed in table 2.
7.1. Actions as an individual
As outlined in table 3, there are a range of relatively simple actions that individuals can undertake to help improve insomnia. Such interventions fall under the general term of ‘cognitive behavioural therapy for insomnia’ or CBTi, which aims to control the environment and the behaviours that precede sleep. Key aspects of CBTi are considered below and are targeted at four different times:
(i) During the day. Individuals should get as much morning natural light as possible, as this has been shown to advance the circadian clock, and an earlier bedtime results in extended sleep. Note that a minority of individuals who are very early chronotypes, who go to bed and get up very early, might benefit from late afternoon/evening light exposure, which would delay the clock and align them more closely to the rest of the population. In the absence of natural light, timed light exposure using a light box has also been shown to be helpful for some sleep/wake timing problems. If there is a need to nap, the nap should be approximately 20 min, as recovery from longer naps can lead to sleep inertia (impaired cognitive and sensory-motor performance). In addition, naps close to bedtime (within approx. 6 h) will act to reduce the homeostatic drive for sleep (figure 4), and this will delay sleep onset. Exercising close to bedtime will elevate core body temperature, and this may delay sleep onset in some people, particularly if the exercise is very vigorous. The delayed sleep may be linked to the fact that sleep initiation seems to involve/require a small reduction in core temperature, and exercise may override this circadian-driven change in body temperature. Feeding during the latter part of the day has been shown to predispose individuals to weight gain and increased susceptibility to metabolic abnormalities such as diabetes II. Weight gain can predispose to obstructive sleep apnoea, where the walls of the throat relax and narrow during sleep, interrupting normal breathing. If untreated, this can lead to multiple health issues, for example obesity, diabetes and other sleep difficulties, such as restless legs syndrome. In addition, digestive processes (e.g. gut motility, release of digestive enzymes) are reduced towards the evening.
If the major meal of the day is eaten just before bedtime, this can predispose individuals to digestive health problems such as excessive stomach acid production and a greater risk of stomach ulcers. Caffeine can have a major alerting effect on the brain, as it blocks the brain receptors that respond to adenosine, which provides one of the homeostatic drivers for sleep (figure 4). There is considerable individual variability in responses to caffeine, depending on body weight, pregnancy status, medication, liver health and caffeine exposure, but in healthy adults the half-life (the time required to reduce caffeine to half of its initial value) is approximately 5–6 h. As a result, a strong coffee or tea in the afternoon could delay sleep onset. Short-term emotional stress is also a very powerful agent for the disruption of sleep (figure 7). As a result, try to resolve stressful situations during the day.
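The caffeine half-life arithmetic above can be sketched with simple exponential decay. The only grounded input is the roughly 5–6 h half-life in healthy adults; the cup size and the times of day below are hypothetical values chosen for illustration.

```python
# Minimal sketch of the half-life arithmetic: the amount of caffeine still
# circulating after t hours falls by half every half-life.
# The 100 mg dose and the 8 h gap (coffee at 15:00, bed at 23:00) are
# hypothetical; the 5.5 h half-life is the midpoint of the 5-6 h range.

def caffeine_remaining(dose_mg, hours_elapsed, half_life_h=5.5):
    """Caffeine (mg) still in the body after hours_elapsed."""
    return dose_mg * 0.5 ** (hours_elapsed / half_life_h)

remaining = caffeine_remaining(100, 8)
print(f"{remaining:.0f} mg still circulating at bedtime")
```

Under these assumptions, just over a third of an afternoon coffee's caffeine is still active at bedtime, which is why the text suggests avoiding strong coffee or tea in the afternoon.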
(ii) Before bed. Light can have an alerting effect upon consciousness and lead to delayed sleep onset. As a result, reducing light exposure 30 min prior to bed may be useful physiologically (reducing alertness) and perhaps psychologically as part of a routine of ‘sleep preparation’. There has also been extensive, and somewhat confusing, discussion regarding the impact of light intensity and wavelength prior to bedtime on shifting the circadian system. Light around dusk and in the evening will delay the circadian system, while light in the morning will advance it. This fact has been used to support the argument that computer or smartphone use prior to bedtime will disrupt sleep/wake timing, promoting later sleep times. Further, software has been developed to shift the colour spectrum of computer screens, lowering the blue light content and so reducing the activation of the ‘blue light sensitive’ pRGCs (figure 1). While this is logical, the impact of different colours of light on alertness is complex, and the extent to which light from screens before bedtime represents a significant problem remains unresolved. For example, one recent study compared the impact of reading a light-emitting ebook (LE-eBook) versus reading a printed book for 4 h prior to bedtime. The light intensity of the LE-eBook was on the maximum setting (approx. 31 lux), while the light reflected off the page of the printed book was approximately 0.9 lux. The results showed that LE-eBook use delayed sleep onset by less than 10 min compared with reading the print book. Although statistically significant, a delay of 10 min is not particularly noteworthy. In summary, while it is probably sensible to minimize light exposure prior to bedtime, to reduce levels of alertness and to prepare psychologically for bed, the impact of light from digital devices needs further investigation.
Although the physiological impact of light exposure from screen use remains unresolved, technology-related behaviours (games, computer or phone use) do impact negatively on sleep and daytime function, and are a particular problem for teenagers. The use of prescription sedatives to aid sleep can be useful in the short term to adjust sleep, but long-term use, particularly in night shift workers, can cause problems because of side-effects. For example, chronic use of benzodiazepines (e.g. Xanax, Valium, Ativan and Librium), which are anti-anxiety medications that increase drowsiness, is potentially addictive and can lead to impaired memory consolidation and reduced attention during the day. Non-prescription sedatives such as alcohol and antihistamines (e.g. diphenhydramine and doxylamine) should be avoided, as the side-effects can severely impact upon health and daytime functioning. It is important to avoid the discussion or consideration of stressful topics immediately before bed, as the acute elevation of cortisol and adrenaline will increase alertness and delay sleep (figure 5). A relaxing behaviour, such as a bath or shower, or warming the hands and feet, can be useful as it may promote peripheral vasodilation and the lowering of core body temperature, which can help sleep initiation. In addition, a bath can be part of a bedtime routine that psychologically prepares you for sleep.
(iii) The bedroom. Making the bedroom or sleeping space suitable for sleep is a much overlooked yet critical part of getting satisfactory sleep. If the bedroom is too warm, this will affect the ability to lower core body temperature, and hence delay sleep onset. Ideally, the bedroom should promote sleep by minimizing distractions and stimuli that alert the individual. The sleeping space should be quiet and dark, and devices such as televisions, computers and smartphones should be removed. Smartphones are now used routinely as alarm clocks, and so removing them from the sleeping space can be problematic. However, if the phone is a distraction, then it should be replaced by an alarm clock, but even this is not straightforward. Many individuals ‘clock watch’ and get anxious about the amount of time left available for sleep, constantly checking and re-checking the alarm clock and generating more anxiety. Under these circumstances, the alarm setting can be used, but the face of the clock should be covered. Sleep apps can be useful in providing a quantitative measure of sleep duration, and in this regard most are reasonably accurate. By contrast, measures of REM versus non-REM or even ‘deep sleep’ are more difficult to assess from the currently available devices, and may even be profoundly misleading. In theory, such monitoring systems could be useful in recording that a change in behaviour (table 3) has translated into improved sleep. But because most of the commercial apps available fail to provide an accurate measure of overall sleep, individuals can become anxious if their device inaccurately reports ‘insufficient restful sleep’ or ‘low levels of REM sleep’. The lack of validation or FDA approval of the currently available devices is an additional concern, and it is worth noting that very few sleep apps have been endorsed by the sleep academies or sleep specialists. As a result, it would be wise not to take sleep apps too seriously.
(iv) In bed. Keeping a good bedtime routine of getting up and going to bed at the same time, and, importantly, at a time that is optimal for sleep-need, has been shown to be important for maintaining good sleep. Such a schedule reinforces the exposure to environmental zeitgebers, especially light and food, which act to entrain the circadian system and stabilize the sleep/wake cycle. Individuals who are ‘natural long sleepers’, needing 9 h or more of sleep each night, may not be able to achieve sufficient sleep during the working week, and it remains unclear whether these individuals might benefit from oversleeping on free days. A good mattress, pillows and bedding make intuitive sense for good sleep but, surprisingly, strong empirical evidence for mattress quality is lacking. Bedside lights should be bright enough for reading, but kept as low as possible to reduce alertness. Relaxing oils are often proposed to help improve sleep. However, the evidence base is largely absent, and any effects could well be placebo. Further research is needed, but for some individuals relaxing oils do anecdotally improve sleep, perhaps because a distinctive ‘conditioning’ smell, such as lavender, can be part of a bedtime routine that psychologically prepares individuals for sleep. Ear plugs can help if a sleeping partner snores, or if there is external noise. If a partner's snoring becomes too disruptive, then an alternative sleeping space should be identified. Waking at night can occur for multiple reasons, and need not mean the end of sleep. Under such circumstances, it is important not to activate stress responses by remaining in bed and becoming increasingly frustrated by the failure to sleep. Some individuals find it useful to leave the bed, keep the lights low and engage in a relaxing activity such as reading or listening to music.
Significantly, and as mentioned above, a single period of consolidated sleep (monophasic sleep) may not be the ‘universal state of sleep’, and could represent an artefact of a shortened night and greatly compressed sleep. Biphasic sleep (sleeping during two periods interrupted by wake) or polyphasic sleep (multiple sleep/wake episodes) is the normal situation for most animals, and may have been for humans before the Industrial Revolution [184–186]. Although there is no universal agreement, the original concept that the natural state of human sleep is polyphasic was partly developed from human historical research [188,189], and therefore provides a good example of how historical studies can inform contemporary science. Indeed, laboratory-based studies subsequently supported the idea that human sleep is polyphasic [45,190]. This raises the important point that, if the natural state of human sleep is indeed polyphasic, then we need to re-think our interpretation of ‘disrupted sleep’ at night. Collectively, the emerging data suggest that if an individual wakes at night, then sleep is likely to return, provided sleep is not sacrificed to social media and/or other alerting behaviours.
In table 3, different forms of CBTi are listed that have been used to help improve sleep. Nevertheless, it is important to stress that there is remarkable variation in sleep duration, timing and structure, not only between individuals but also within the same individual across the lifespan. This means that the individual has to identify what works best for them, and then defend those behaviours that promote optimal sleep.
7.2. Actions as an employer
In parallel with individual actions and CBTi, employers could implement measures in the workplace to help address some problems arising from SCRD. Potential employer responses are summarized in table 4.
Outlined in table 4 are some actions that employers could undertake to improve employee safety, health and welfare arising from work-related SCRD. For example, night shift work and extended working are associated with the loss of vigilance and a high frequency of micro-sleeps, and this can be dangerous both in the working environment and on the commute home (table 2). Driver fatigue has long been recognized as a major cause of road accidents. A recent study showed that 57% of junior doctors had either had a motor vehicle crash or near miss after working on the night shift. For this reason, some hospitals provide taxis to get staff home after the night shift. For many years, the rail industry has used some form of ‘Dead Man's Switch' or Driver Safety Device to alert the driver that they have lost vigilance or fallen asleep, but such preventative measures have not, until recently, been widely adopted in either domestic or commercial motor vehicles. Part of the problem has been the lack of availability of non-invasive Driver Drowsiness Detection technology. However, in recent years a range of devices, including steering-pattern monitoring, vehicle lane-position monitoring and driver eye/face monitoring to detect drowsiness, have been developed and commercialized. Employers could make such devices available to employees who are at risk of fatigue or who undertake shiftwork and then commute home in their own vehicle. Perhaps the reason why some employers have not supplied such devices is that this might be seen as an admission of liability. However, not to do so could also constitute a failure of ‘duty of care' by an employer, with serious legal consequences.
Loss of vigilance in the workplace could be reduced by illuminating the working environment with light sufficiently bright to promote alertness. Although increasing light levels to the 1000 lux range seems to be useful, more studies are needed to define precisely when and how much light is needed in different settings. As long-term sleep loss and night shift work are associated with a range of physical and mental health problems (table 2), employers could offer higher-frequency health checks for at-risk individuals to detect problems early, so that appropriate interventions can be implemented to prevent chronic health conditions developing. In the same way, knowing that metabolic abnormalities and cardiovascular diseases have a much higher prevalence in shift workers and the chronically tired (table 2), appropriate nutrition could be made available in the workplace to help reduce the development of these conditions, combined with educational advice regarding diet when not at work. Indeed, the development of educational materials by an independent body, such as the Department of Health in the UK or equivalent bodies in other countries, for employees, employee partners and family, explaining the impact and consequences of sleep loss, could be of immense value in terms of developing coping strategies. Multiple studies have shown that the divorce rate is higher and that social interactions are more negative when one partner is involved in shiftwork. Part of the problem may be the failure of the partner to understand some of the negative impacts upon behaviour as a consequence of shift work or sleep loss. Again, the state, working with employers, could provide educational materials to support employees and those with whom they share their lives. Finally, there is considerable variation across the population in terms of chronotype.
Studies have shown that the greater the mismatch between an individual's endogenous sleep/wake timing and the time when an individual is required to work (often called social jet lag), the greater the risk of developing health problems (table 2). Employers could attempt to match an individual's chronotype to specific work schedules. Put simply, the ‘larks' would be better suited to morning shifts and the ‘owls' to night shifts. Clearly, this is not the complete solution to shift work, but it could mitigate some of the significant problems of working against internal time.
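As a toy illustration of the chronotype-matching idea above, the following sketch assigns the workers with the strongest morning preference to morning shifts and the strongest evening preference to night shifts. The `assign_shifts` function, the staff names and the chronotype scores are all hypothetical; real scheduling would rest on validated chronotype questionnaires and many additional constraints (contracts, rotation rules, individual health).

```python
# Hypothetical sketch: greedy assignment of workers to shifts by chronotype.
# Scores and threshold behaviour are illustrative assumptions, not
# validated cutoffs from the chronotype literature.

def assign_shifts(workers, morning_slots, night_slots):
    """Rank workers by chronotype and fill shifts greedily.

    workers: list of (name, chronotype_score); a HIGHER score means a
    stronger morning preference ('lark'), a lower score an 'owl'.
    Returns (morning_shift_names, night_shift_names).
    """
    ranked = sorted(workers, key=lambda w: w[1], reverse=True)
    morning = [name for name, _ in ranked[:morning_slots]]
    night = [name for name, _ in ranked[len(ranked) - night_slots:]]
    return morning, night

staff = [("A", 72), ("B", 35), ("C", 58), ("D", 41)]
morning, night = assign_shifts(staff, morning_slots=2, night_slots=2)
# The two highest scorers ('A', 'C') take mornings;
# the two lowest ('D', 'B') take nights.
```

This deliberately ignores real-world constraints; its only purpose is to make the "match larks to mornings, owls to nights" heuristic concrete.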
This review has considered the biology of sleep and circadian rhythms, some of the consequences of disrupting these rhythms as a result of disease or societal pressures, and has outlined approaches to help mitigate a few of the problems associated with SCRD. It is disconcerting that although there has been a major increase in our understanding of the importance of sleep and of the consequences of sleep disruption, and a realization that SCRD dominates the lives of millions of individuals in both the developed and developing nations, there has not been commensurate action to deal with the problems. At an organizational level, we are not training our healthcare professionals in this critical area of bioscience; governments have failed to address the broad issue of sleep health with appropriate legislation or the development of clear evidence-based guidelines; and there remains a dearth of research information to allow government and employers to provide evidence-based and specific advice to the workforce regarding how to cope with work-related SCRD. At a broader level, and at least in the short term, there is no ‘magic bullet' for the impact of shift work or work-related sleep loss. Employers and employees have to accept that there will always be significant health consequences associated with sleep loss, and that currently, the best we can hope to achieve is a reduction in the severity of symptoms associated with SCRD. As a result, society needs to consider very seriously the circumstances under which the consequences of SCRD are justified in the workplace. Such decisions must emerge from an evidence-based discussion involving academia, government, industry and, above all, the workforce, and hopefully these discussions will take place before litigation distorts and derails the debate.
PBM Alters the Microbiome
We have shown in a previous study 153 that PBM, delivered as low-level laser light to the abdomen of healthy mice, can produce a significant change in the gut microbiome. PBM significantly altered the microbial diversity of the microbiome, an effect most pronounced in mice treated three times per week with NIR light (808 nm), but not apparent after a single treatment with red light. PBM also produced a 10,000-fold increase in the proportion of the beneficial bacterium Allobaculum in the microbiota of mice after 14 days of treatment with NIR light, but not with red light (Fig. 3).
FIG. 3. Change in the proportion of Allobaculum sp. in the total microbiota after PBM treatment with red and infrared laser. M, multiple (three times per week for 2 weeks) doses of PBM; S, single dose of PBM. (Adapted from Bicknell et al. 153 )
This study has recently been repeated (unpublished) with larger numbers of mice in the experimental groups (10 per treatment group). The wavelength was again shown to be an important parameter, with NIR wavelengths showing a more pronounced effect than red light, and the proportion of bacteria associated with a healthy microbiome in mice generally increased while the proportion of bacteria associated with a dysregulated microbiome generally decreased. Blivet and colleagues have also hypothesized that the microbiome (in mice) is important for the treatment of Alzheimer's disease with PBM 154 and have shown significant changes in the microbiome of mice injected with β-amyloid after treatment with a combination of PBM wavelengths and a static magnetic field (personal communication and 155 ). Recent preliminary work from our laboratory (unpublished) has also indicated that changes in the human (quasimetabolic syndrome) microbiome occur after treatment with PBM, including increases in Akkermansia muciniphila, Bifidobacterium sp., and Faecalibacterium sp., all recognized as correlated with a healthy microbiome, 156–158 and decreases in the Firmicutes:Bacteroides ratio, proposed as an indicator of gut health. 159,160
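The microbiome measures mentioned above (microbial diversity and the Firmicutes:Bacteroides-type ratio) are computed from taxon abundance tables. The sketch below shows a standard Shannon diversity index and a simple phylum-ratio calculation; the read counts and the use of "Bacteroidetes" as the phylum label are invented for illustration and do not come from the studies cited here.

```python
import math

def shannon_diversity(counts):
    """Shannon index H' = -sum(p_i * ln p_i) over taxa with nonzero counts."""
    total = sum(counts.values())
    return -sum((c / total) * math.log(c / total)
                for c in counts.values() if c > 0)

def fb_ratio(counts):
    """Firmicutes:Bacteroidetes-style ratio from phylum-level counts."""
    return counts["Firmicutes"] / counts["Bacteroidetes"]

# Illustrative (made-up) phylum-level read counts before/after a treatment.
before = {"Firmicutes": 600, "Bacteroidetes": 300, "Actinobacteria": 100}
after = {"Firmicutes": 450, "Bacteroidetes": 450, "Actinobacteria": 100}
# fb_ratio(before) is 2.0 and fb_ratio(after) is 1.0, i.e. the ratio
# decreases, the direction reported above as a proposed marker of gut health.
```

Note that the Shannon index depends on the logarithm base chosen (natural log here); published studies should be compared only after checking which convention they use.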
UV therapy of skin has been shown to affect the skin microbiome by altering barrier function, leading to microbial-specific skin-resident memory T cells, disrupting the healthy balance between skin microbiome and skin immune cells, and resulting in chronic inflammation and diseased skin. 161 On the other hand, UV irradiation of blood has been used for infections, autoimmune diseases, and some metabolic disorders. 162,163 The mechanisms of action are still uncertain despite many years of investigation.
Despite its importance for mating and nutrition, hormonal neuromodulation is a field that has received less attention. The OB is well positioned for hormonal neuromodulation: certain blood-borne molecules can reach the OB more easily than other brain areas, since the density of the blood capillary network, especially in the GL, is very high (Lecoq et al.) and the blood-brain barrier at the OB is more permeable (Ueno et al.). A specialized transport system for certain hormones provides an additional means to increase the local concentration of those hormones within the OB (Banks et al.).
Hormones have many diverse functions: sex steroids like testosterone or estradiol regulate sexual differentiation and behavior (McEwen and Milner 2017); neurohormones like melatonin affect circadian rhythms (Brown 1994); and metabolic hormones like ghrelin and insulin regulate feeding (Julliard et al.). Receptors for both estrogens (Hoyk et al.; Maruska and Fernald) and melatonin (Corthell et al.) are expressed in the OB, and hormonal effects could be demonstrated (Corthell et al.; Dillon et al.). However, the presence of synthesizing enzymes for these hormones within the OB (Corthell et al.; Hoyk et al.) speaks rather for a local neuropeptidergic function.
Remotely produced hormones that act on OB cells have so far been linked to the metabolic regulation of food intake (see Palouzier-Paulignan et al.). The olfactory system is known for its major contribution to the hedonic evaluation of food (with effects on food choices and consumption), and it seems to make sense that olfaction would be modulated according to foraging needs (Julliard et al.). Hormones that influence foraging are divided into orexigenic (appetite-stimulating) and anorexigenic (appetite-suppressing) hormones. So far, ghrelin and adiponectin have been identified as orexigenic molecules, and insulin and leptin as anorexigenic molecules. These hormones have different sources (Fig. 1d): ghrelin is produced primarily by the stomach (Kojima et al.), leptin is predominantly generated by adipose cells and enterocytes in the small intestine (Bado et al.), adiponectin is synthesized predominantly in adipose tissue (Scherer et al.), while insulin is released by pancreatic beta cells in response to feeding state in a glucose-dependent manner (Henquin 2011).
The best-investigated metabolic hormone with a function in the OB is insulin. The OB shows the highest insulin receptor (insulin receptor kinase) density in the whole brain (Hill et al.), and insulin has been shown to cause an increase in firing frequency and inhibition of spike adaptation in OB MCs (Fadool et al.). As a substrate, the voltage-activated K+ channel Kv1.3 has been identified, which, when phosphorylated by insulin receptor kinase, causes a change in MC excitability (Fadool et al.). Adiponectin receptors have been found in all OB cell layers, and adiponectin injection into the OB was found to regulate the expression of insulin receptors (Miranda-Martinez et al.).
Ghrelin is transported across the blood-brain barrier and is present in high concentrations in the OB (Rhea et al.). So far only one ghrelin receptor has been identified, the growth hormone secretagogue receptor (GHSR-1a), which is expressed in the GL and MCL (Tong et al.). Functionally, ghrelin has been shown to increase exploratory sniffing behavior and olfactory sensitivity, but it is unclear whether this effect is due to local ghrelin signaling.
The OB also has high levels of leptin receptors (Shioda et al.), but despite studies showing that leptin decreases olfactory sensitivity (Julliard et al.) and that leptin-deficient mice perform better in olfactory detection (Getchell et al.) and memory tasks (Chelminski et al.), the cellular mechanisms of these changes remained unclear for a long time. Only recently was it shown that leptin decreases the excitability of MCs/TCs as well as GCs through direct modulation of a voltage-sensitive potassium channel, which leads to a net inhibition of the MTC population and negatively affects discrimination performance (Sun et al.).
As mentioned for ghrelin, it is not exactly clear whether the orexigenic and anorexigenic effects of these hormones are caused by their actions in OB circuits or whether the change in the sense of smell is a secondary effect. The global rise in obesity, and the research into diet and metabolism it has prompted, will shed more light on this relationship.
Linking psychological with biological mechanisms of resilience
Although an extensive number of studies have documented the neurobiological circuitries mediating the stress response and reward experience, it remains a challenge to tease apart the exact biological systems and pathways that mediate and regulate the psychological building blocks (as described above) underlying resilience. This challenge is complicated by the different definitions of the resilience concept used in previous research [see e.g. 6 ], by the different processes subsumed under resilience (such as sustainability and recovery, whose psychological and biological underpinnings are, at least partly, distinct from each other), and by a lack of studies assessing both psychological and biological variables. However, the current state of research does support that the above-mentioned psychological resilience factors in the human studies are related to the stress and reward systems of the brain. With respect to attachment, there is some evidence to suggest that experiences of trauma and stress during early life may result in a sensitized stress system, i.e. increased stress responses to minor stressors in daily life 123, 124 , and that individuals with secure attachment are less stress reactive in adulthood than those without. Similarly, findings from animal studies have suggested that maternal care programmes the offspring's stress responses through epigenetic regulation of the expression of genes controlling the HPA axis 23, 125 , persisting into adulthood. Similar to the observed alterations in the methylation level of the promoter region of NR3C1 in the animal study, a recent study in humans investigated the methylation profile of the NR3C1 promoter region in postmortem hippocampus samples from suicide victims with a history of childhood abuse, and compared these to the methylation profile in samples from either suicide victims with no childhood abuse or control subjects.
Consistent with the animal findings, abused suicide victims had increased CpG methylation of the NR3C1 promoter, concomitant with a decrease in NR3C1 gene expression 126 . Interestingly, the experience of positive emotions during everyday situations seemed to buffer against stress reactivity and against the genetic influence on stress reactivity 72, 127, 128 , and it is tempting to speculate that secure attachment may be important in the preprogramming of sensitivity of the reward system, buffering impact of stress systems when activated. The finding that religious practice like praying or remembering a religious experience activated areas of the reward system 118, 119 fits with the hypothesis that positive feelings mediate the process of resilience. One could argue that higher stress sensitivity drives the inability to experience positive feelings, or that stress sensitivity and positive feelings represent two extremes of one and the same continuum. However, a recent study showed that individuals who are stress sensitive in everyday life are not necessarily also low in daily life reward reactivity. In fact, these two phenotypes were not correlated and were not influenced by the same genetic and environmental factors (C. Lothmann, N. Jacobs, C. Derom, E. Thiery, J. Van Os, M. Wichers, personal communication) and thus do not represent the two extremes of a single continuum. This suggests that these traits can co-occur, and may be mediated via different mechanisms. People can be vulnerable in terms of their tendency to be stress reactive, but also protected from this vulnerability trait in the face of strong tendencies to experience positive emotions in daily life (i.e. from pleasant events or sense of meaning) which buffer stress, prevent future psychopathology and increase mental health. 
Thus, it seems that the experience of positive emotions has a distinct and more central role in resilience, defined as successful adaptation, swift recovery and psychological growth during and after exposure to severe adversities, while the stress-response system appears mainly to mediate vulnerability to stressors. Because the stress-response and reward systems are closely related in both a psychological and a biological sense (see Fig. 2), it seems very interesting (and challenging) to explore the exact interrelations and crosstalk between psychological and biological factors of the stress-response and reward-experience systems. It is therefore interesting that a PET imaging study in adult individuals showed that dopamine release under psychosocial stress in the ventral striatum (to which dopaminergic neurons from the VTA project) was related to parental care during the early life of these individuals 129 . More specifically, psychosocial stress caused a significant release of dopamine in the ventral striatum particularly in subjects reporting low parental care, suggesting that resilience to the psychosocial stressor was related to decreased firing of dopaminergic projections from the VTA to the ventral striatum (including the NAc) 129 . Likewise, animals susceptible to the effects of chronic social defeat displayed increased firing rates of dopaminergic VTA neurons, mediated by expression of the BDNF gene, whereas unsusceptible or ‘resilient' animals displayed normal firing rates of dopaminergic VTA neurons and no behavioural signs of anhedonia in the sucrose preference test (reflecting behaviour related to reward experience) 31 , suggesting a crucial role for the sustainability of reward experience, mediated by mesolimbic dopaminergic neurotransmission and BDNF signalling, in ‘resilience' to social defeat stress in animals.
Other studies have further documented a role for epigenetic changes in the BDNF gene and risk for psychiatric disorders, for review see 130 . The serotonergic system has furthermore been connected to epigenetic mediation of experience during early life impacting on the stress response system, emotional processing in the brain and affective functioning 131, 132 . For example, recent studies on humans and macaque monkeys showed that higher methylation levels of the 5-HTT gene were associated with stronger effects of stress 133, 134 .
It has been proposed that a combination of various adverse environmental exposures throughout development (such as pre- and perinatal stress, low maternal care and childhood trauma) can sensitize the behaviour and central nervous system of an individual, thereby giving rise to a trajectory of risk for psychiatric disorder, starting with subclinical symptoms that become abnormally persistent when synergistically combined with further adversities. Evidence indeed suggests that certain environmental exposures may synergistically lead to subclinical symptoms and subsequent psychiatric disorders by impacting on the HPA axis 135 and mesolimbic dopaminergic neurotransmission 136 , while recent evidence suggests that sensitization to environmental exposures depends on epigenetic mechanisms 20, 137 . Together, these findings suggest that experience-dependent regulation of gene expression by the epigenetic machinery, impacting on genes involved in the HPA axis and the mesolimbic dopaminergic reward system, underlies the psychological building blocks of resilience by mediating the response to, and enduring impact of, stressors throughout life.
Current challenges and future perspectives
Despite the considerable progress that resilience research has made in recent years, several issues challenge its current state. Progress has been hampered by the use of different definitions and different measures of resilience [for in-depth discussion see e.g. 6 ], and future studies should better specify the concept of resilience used in the study (e.g. attenuation of health disturbance, or enhanced adaptation and recovery) and how the measured variables relate to these definitions. Resilience studies may thus profit from specifying the definition of resilience, and from studying distinct aspects (e.g. recovery) of resilience by incorporating a range of measures that are thought to reflect different levels of resilience, such as self-evaluations of functioning, questionnaires and interviews, behavioural and psychological phenotypes, physiological measures such as skin conductance, heart rate and blood pressure, and (molecular) biological samples such as salivary cortisol and blood lymphocytes for gene expression and epigenetic analyses. Given the difficulties in adequately measuring psychiatric symptoms and psychological functioning in daily life with regularly used questionnaires and clinical interviews, it may be very productive to extend experience-sampling methodologies (as described above), which are able to capture fluctuations in psychological functioning in daily life in a prospective manner 138 .
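As a toy illustration of how experience-sampling data can quantify daily-life stress reactivity, the sketch below estimates a per-participant slope of momentary negative affect on momentary stress across repeated beeps. The variable names and ratings are invented, and published ESM analyses typically use multilevel regression rather than the simple per-person least squares shown here.

```python
# Hypothetical sketch of an experience-sampling (ESM) summary statistic:
# per-person 'stress reactivity' as the slope of momentary negative affect
# on momentary stress, estimated separately for each participant.

def ols_slope(x, y):
    """Slope of y regressed on x by simple least squares."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

# Beeps per participant: (momentary stress rating, negative affect rating).
participants = {
    "p1": ([1, 2, 3, 4], [2.0, 2.5, 3.0, 3.5]),  # more stress-reactive
    "p2": ([1, 2, 3, 4], [2.0, 2.1, 2.2, 2.3]),  # less stress-reactive
}
reactivity = {pid: ols_slope(x, y) for pid, (x, y) in participants.items()}
# reactivity["p1"] is 0.5, reactivity["p2"] is 0.1: affect in "p1" rises
# five times more steeply with momentary stress than in "p2".
```

Such within-person slopes are one concrete way of turning prospective daily-life assessments into a resilience-related phenotype that can be related to biological measures.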
Because individual trajectories of risk and resilience (Fig. 1) are difficult to capture cross-sectionally at a given moment, and because the available evidence strongly suggests a crucial role for exposures and experiences during development and preadult life in shaping resilience during adulthood, it will be very interesting for future research to prospectively investigate (e.g. birth) cohorts, and to assess not only the index individuals but also their siblings and parents using genetically sensitive designs. Prospective twin studies will be very informative in teasing apart the contributions of genetic factors, environmental factors and gene–environment interactions. Given recent preliminary findings suggesting a crucial role of the epigenetic machinery in regulating adaptive responses to stress (described above), further research may establish the role of epigenetics in resilience, for example by studying monozygotic twins discordant for resilience-related phenotypes.
To optimally align the translational aspects of human and animal studies on resilience, it will further be important to use (or design) behavioural and biological tests that can be conducted in both research settings. For example, measures of social approach and avoidance behaviour, generalization of anxiety, measures of mother–child relationships, sensitivity of the autonomic nervous system to a standardized stressor, or the cortisol/corticosterone response to a standardized stressor are possible in various animal species and may be very useful from a translational neuroscience perspective.
The field of experimental animal research may be particularly fruitful in elucidating the molecular and cellular mechanisms of resilience when extending its focus from mere investigations on the impact of a stressor by comparing exposed vs. non-exposed groups, to also studying differential susceptibility to a given stressor 139 as well as studying the rate of recovery of animals showing stress-related behavioural disturbances.
Recommendations for interventions aimed at increasing resilience in humans
The current literature review suggests that positive emotions are crucial to counteract the experience of stress. Feelings of positive emotion are strongly related to a sense of meaning and life purpose. Interventions that successfully increase the experience of positive emotions have become available in Western society 96, 97 . Meditation techniques such as loving-kindness and mindfulness training may increase both feelings of purpose in life and positive emotional experience. These effects have been established at both the psychological and the biological level, as mental training through meditation has been shown to change brain function 140 . Also, ancient religious practices, such as praying, counting one's blessings and finding oneness with God, contribute to a sense of meaning and positive emotional experience 116, 118, 119 . Engaging in religious practices may thus have a positive influence on one's level of resilience. This should be further acknowledged and understood in current practices of mental health care, to optimally support patients in their search for meaning.
To conclude, the current literature on resilience does show some converging evidence on links between psychological and biological aspects at the individual level although the field is expected to greatly benefit in the near future from multidisciplinary and translational-oriented research efforts.