Some context, before the question:
Whenever I have a craving to binge on something sugary, I just prepare a cup of extremely bitter green tea (with 3 bags of tea) and I imagine myself binging on something sugary, but at the same time I take a gulp of this bitter stuff and let that sit in my mouth for a few seconds.
The very next moment, I notice the craving subside, and after a few more sips it goes away completely. For another day afterward, I feel nauseated if I so much as think about something sugary.
Why and how does this work? Even though I consciously know that I am fooling myself with a false sensory perception, why does my brain come to associate the sugary treat with being revolting and poisonous (hence the nausea)?
These are learning phenomena you describe. I'll try to offer a simple way to think about them.
By default, sweet foods are appetitive and, for instance, strongly bitter foods are aversive. However, it is possible to condition yourself to associate a stimulus, no matter how it presents originally, with a different valence (appetitive or aversive).
There are several well-understood forms that conditioning can take. I list a few here, along with some subtypes; they are described well on Wikipedia:
-- Serial application of classical conditioning can be called second-order conditioning.
-- You can also condition yourself through self-evaluation, called evaluative conditioning.
Covert conditioning is a technique more than a type of conditioning; it may apply directly to your question, as it involves the use of imagery.
Operant and classical conditioning are the two major paradigms to know. Classical (Pavlovian) conditioning involves pairing an "unconditioned" stimulus with a neutral "conditioned" stimulus. Operant conditioning involves providing feedback, usually a reward or punishment, to reinforce learning.
You can self-learn, or condition yourself. This does not necessarily mean it is intentional or conscious on your part; often it is subconscious or unconscious. For instance, if you associate eating crisps with watching a movie, watching a movie without crisps can become uncomfortable, despite the fact that watching a movie was originally a pleasurable activity on its own. In a neuroscience seminar on conditioning in fruit flies, I once noted a saying that captured the spirit of the experiment well: an expected reward, not experienced, can feel like a punishment, and vice versa. Rewarding yourself after doing chores can likewise be an easy and practicable way to build good habits around them.
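One common formal model of this kind of associative learning is the Rescorla-Wagner rule, in which the associative strength of a stimulus is nudged toward the outcome it currently predicts. A minimal sketch in Python (the learning rate and outcome values are illustrative, not drawn from any particular experiment):

```python
def rescorla_wagner(v, outcome, alpha=0.3):
    """One trial of the Rescorla-Wagner update: dV = alpha * (outcome - V).

    v       -- current associative strength of the stimulus
    outcome -- +1.0 for an appetitive pairing, -1.0 for an aversive pairing
    alpha   -- learning rate (stimulus salience); illustrative value
    """
    return v + alpha * (outcome - v)

# Sweet taste starts out strongly appetitive (positive association)...
v = 1.0
# ...but repeated pairing with a strongly aversive outcome (bitter tea)
# drives the association negative over a handful of trials.
for trial in range(5):
    v = rescorla_wagner(v, outcome=-1.0)
print(round(v, 3))
```

Note how the same update explains the seminar saying: an expected reward that fails to arrive (outcome below the current prediction) pushes the association downward, which registers like a punishment.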
Overview of the Five Senses
The ways we understand and perceive the world around us as humans are known as senses. We have five traditional senses known as taste, smell, touch, hearing, and sight. The stimuli from each sensing organ in the body are relayed to different parts of the brain through various pathways. Sensory information is transmitted from the peripheral nervous system to the central nervous system. A structure of the brain called the thalamus receives most sensory signals and passes them along to the appropriate area of the cerebral cortex to be processed. Sensory information regarding smell, however, is sent directly to the olfactory bulb and not to the thalamus. Visual information is processed in the visual cortex of the occipital lobe, sound is processed in the auditory cortex of the temporal lobe, smells are processed in the olfactory cortex of the temporal lobe, touch sensations are processed in the somatosensory cortex of the parietal lobe, and taste is processed in the gustatory cortex in the parietal lobe.
The limbic system is composed of a group of brain structures that play a vital role in sensory perception, sensory interpretation, and motor function. The amygdala, for example, receives sensory signals from the thalamus and uses the information in the processing of emotions such as fear, anger, and pleasure. It also determines what memories are stored and where the memories are stored in the brain. The hippocampus is important in forming new memories and connecting emotions and senses, such as smell and sound, to memories. The hypothalamus helps regulate emotional responses elicited by sensory information through the release of hormones that act on the pituitary gland in response to stress. The olfactory cortex receives signals from the olfactory bulb for processing and identifying odors. In all, limbic system structures take information perceived from the five senses, as well as other sensory information (temperature, balance, pain, etc.), to make sense of the world around us.
Long-Term Memory and the Senses
While sensory memory usually refers to memory that immediately and briefly follows perception, sensory impressions can leave traces in memory that last for years. The forms of memory related to these senses are typically described in terms of which sense they reflect.
Visual-spatial memory captures details about where visible things are located relative to each other. Auditory memory, olfactory memory, and haptic memory are terms for stored sensory impressions of sounds, smells, and skin sensations, respectively. We can, of course, remember and recognize tastes as well.
Why are long-term memories of sensory details important?
The impressions of sensory experiences that survive in long-term memory enable people to accomplish the critical task of identification—of people (by their faces or the sounds of their voices), objects, symbols, and anything we can distinguish using the senses. Spatial memory, which includes memory for the appearance of places and for routes between different places (among other kinds of information), provides a foundation for navigating through the environment.
How do these memories relate to other kinds of memory?
Memories of scenes, faces, sounds, smells, physical feelings, and other phenomena are a key part of episodic memory, the mental record of personal experiences. Memory related to sensory experience can be a meaningful part of autobiographical (self-focused) memory, as when a familiar scent suddenly recalls a related childhood experience. And sense-based information, including images that represent abstract concepts (like “cat” or “dog”) are connected to semantic memory, or one’s knowledge about the world.
Vision is the ability to detect light patterns from the outside environment and interpret them into images. Animals are bombarded with sensory information, and the sheer volume of visual information can be problematic. Fortunately, the visual systems of species have evolved to attend to the most-important stimuli. The importance of vision to humans is further substantiated by the fact that about one-third of the human cerebral cortex is dedicated to analyzing and perceiving visual information.
As with auditory stimuli, light travels in waves. The compression waves that compose sound must travel in a medium: a gas, a liquid, or a solid. In contrast, light is composed of electromagnetic waves and needs no medium; light can travel in a vacuum ([link]). The behavior of light can be discussed in terms of the behavior of waves and also in terms of the behavior of the fundamental unit of light, a packet of electromagnetic radiation called a photon. A glance at the electromagnetic spectrum shows that visible light for humans is just a small slice of the entire spectrum, which includes radiation that we cannot see as light because it is below the frequency of visible red light or above the frequency of visible violet light.
Certain variables are important when discussing perception of light. Wavelength (which varies inversely with frequency) manifests itself as hue. Light at the red end of the visible spectrum has longer wavelengths (and lower frequency), while light at the violet end has shorter wavelengths (and higher frequency). The wavelength of light is expressed in nanometers (nm); one nanometer is one billionth of a meter. Humans perceive light that ranges between approximately 380 nm and 740 nm. Some other animals, though, can detect wavelengths outside of the human range. For example, bees see near-ultraviolet light in order to locate nectar guides on flowers, and some non-avian reptiles sense infrared light (heat that prey gives off).
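Since wavelength and frequency are tied together by the speed of light (c = λf), the inverse relationship is easy to verify numerically. A quick sketch checking the band edges of human vision:

```python
C = 299_792_458  # speed of light in a vacuum, m/s

def frequency_hz(wavelength_nm):
    """Frequency of light from its wavelength, via c = wavelength * frequency."""
    return C / (wavelength_nm * 1e-9)

# Band edges of human vision: violet (~380 nm) and red (~740 nm).
# Shorter wavelength means higher frequency, and vice versa.
violet = frequency_hz(380)  # roughly 7.9e14 Hz
red = frequency_hz(740)     # roughly 4.1e14 Hz
```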
Wave amplitude is perceived as luminous intensity, or brightness. The standard unit of intensity of light is the candela, which is approximately the luminous intensity of one common candle.
Light waves travel at 299,792 km per second in a vacuum (and somewhat slower in various media such as air and water), and those waves arrive at the eye as long (red), medium (green), and short (blue) waves. What is termed “white light” is light that is perceived as white by the human eye. This effect is produced by light that stimulates the color receptors in the human eye equally. The apparent color of an object is the color (or colors) that the object reflects. Thus a red object reflects the red wavelengths in mixed (white) light and absorbs all other wavelengths of light.
Anatomy of the Eye
The photoreceptive cells of the eye, where transduction of light to nervous impulses occurs, are located in the retina (shown in [link]) on the inner surface of the back of the eye. But light does not impinge on the retina unaltered. It passes through other layers that process it so that it can be interpreted by the retina ([link]b). The cornea, the front transparent layer of the eye, and the crystalline lens, a transparent convex structure behind the cornea, both refract (bend) light to focus the image on the retina. The iris, which is conspicuous as the colored part of the eye, is a circular muscular ring lying between the lens and cornea that regulates the amount of light entering the eye. In conditions of high ambient light, the iris contracts, reducing the size of the pupil at its center. In conditions of low light, the iris relaxes and the pupil enlarges.
Which of the following statements about the human eye is false?
- Rods detect color, while cones detect only shades of gray.
- When light enters the retina, it passes the ganglion cells and bipolar cells before reaching photoreceptors at the rear of the eye.
- The iris adjusts the amount of light coming into the eye.
- The cornea is a protective layer on the front of the eye.
The main function of the lens is to focus light on the retina and fovea centralis. The lens is dynamic, focusing and re-focusing light as the eye rests on near and far objects in the visual field. The lens is operated by muscles that stretch it flat or allow it to thicken, changing the focal length of light coming through it to focus it sharply on the retina. With age comes the loss of the flexibility of the lens, and a form of farsightedness called presbyopia results. Presbyopia occurs because the image focuses behind the retina. Presbyopia is a deficit similar to a different type of farsightedness called hyperopia caused by an eyeball that is too short. For both defects, images in the distance are clear but images nearby are blurry. Myopia (nearsightedness) occurs when an eyeball is elongated and the image focus falls in front of the retina. In this case, images in the distance are blurry but images nearby are clear.
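The focusing defects above can be illustrated with the thin-lens equation, 1/f = 1/d_o + 1/d_i: with the lens-to-retina distance fixed, a focal length that stays too long puts the image plane behind the retina. A rough numerical sketch (the distances below are illustrative, not physiological measurements):

```python
def image_distance(focal_length, object_distance):
    """Thin-lens equation solved for the image distance:
    1/f = 1/d_o + 1/d_i  ->  d_i = 1 / (1/f - 1/d_o)."""
    return 1.0 / (1.0 / focal_length - 1.0 / object_distance)

RETINA = 17.0        # illustrative lens-to-retina distance, mm
near_object = 250.0  # a book held at reading distance, mm

# A flexible lens shortens its focal length and lands the image on the retina...
good_focus = image_distance(15.9, near_object)        # close to 17.0 mm
# ...but a stiff (presbyopic) lens with a longer focal length focuses the
# image behind the retina, so the nearby page looks blurry.
presbyopic_focus = image_distance(16.5, near_object)  # beyond 17.0 mm
```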
There are two types of photoreceptors in the retina: rods and cones, named for their general appearance as illustrated in [link]. Rods are strongly photosensitive and are located in the outer edges of the retina. They detect dim light and are used primarily for peripheral and nighttime vision. Cones are weakly photosensitive and are located near the center of the retina. They respond to bright light, and their primary role is in daytime, color vision.
The fovea is the region in the center back of the eye that is responsible for acute vision. The fovea has a high density of cones. When you bring your gaze to an object to examine it intently in bright light, the eyes orient so that the object’s image falls on the fovea. However, when looking at a star in the night sky or other object in dim light, the object can be better viewed by the peripheral vision because it is the rods at the edges of the retina, rather than the cones at the center, that operate better in low light. In humans, cones far outnumber rods in the fovea.
Transduction of Light
The rods and cones are the site of transduction of light to a neural signal. Both rods and cones contain photopigments. In vertebrates, the main photopigment, rhodopsin, has two main parts ([link]): an opsin, which is a membrane protein (in the form of a cluster of α-helices that span the membrane), and retinal, a molecule that absorbs light. When light hits a photoreceptor, it causes a shape change in the retinal, altering its structure from a bent (cis) form of the molecule to its linear (trans) isomer. This isomerization of retinal activates the rhodopsin, starting a cascade of events that ends with the closing of Na+ channels in the membrane of the photoreceptor. Thus, unlike most other sensory neurons (which become depolarized by exposure to a stimulus), visual receptors become hyperpolarized and are thus driven away from threshold ([link]).
There are three types of cones (with different photopsins), and they differ in the wavelength to which they are most responsive, as shown in [link]. Some cones are maximally responsive to short light waves of 420 nm, so they are called S cones (“S” for “short”); others respond maximally to waves of 530 nm (M cones, for “medium”); a third group responds maximally to light of longer wavelengths, at 560 nm (L, or “long” cones). With only one type of cone, color vision would not be possible, and a two-cone (dichromatic) system has limitations. Primates use a three-cone (trichromatic) system, resulting in full color vision.
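One way to see how a three-cone system encodes wavelength is to model each cone class's sensitivity as a curve peaking at its preferred wavelength; a given light is then represented by the ratio of the three responses. A rough sketch (the Gaussian shape and the 50 nm width are illustrative assumptions, not measured sensitivity curves):

```python
import math

# Peak sensitivities from the text; the 50 nm width is an illustrative choice.
PEAKS = {"S": 420, "M": 530, "L": 560}
WIDTH = 50.0

def cone_responses(wavelength_nm):
    """Relative response of each cone class, modeled as a Gaussian
    falling off with distance from the class's peak wavelength."""
    return {cone: math.exp(-((wavelength_nm - peak) / WIDTH) ** 2)
            for cone, peak in PEAKS.items()}

# 470 nm (blue) light drives S cones hardest, M cones less, L cones least;
# the brain reads that ratio, not any single cone's output, as a color.
blue = cone_responses(470)
```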
The color we perceive is a result of the ratio of activity of our three types of cones. The colors of the visual spectrum, running from long-wavelength light to short, are red (700 nm), orange (600 nm), yellow (565 nm), green (497 nm), blue (470 nm), indigo (450 nm), and violet (425 nm). Humans have very sensitive perception of color and can distinguish about 500 levels of brightness, 200 different hues, and 20 steps of saturation, or about 2 million distinct colors.
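The "about 2 million distinct colors" figure is simply the product of the three discrimination counts given above:

```python
# Approximate human discrimination counts, as quoted in the text.
brightness_levels = 500
hues = 200
saturation_steps = 20

# Every combination of a brightness, a hue, and a saturation is a
# distinguishable color, so the counts multiply.
distinct_colors = brightness_levels * hues * saturation_steps
print(distinct_colors)  # 2000000
```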
Visual signals leave the cones and rods, travel to the bipolar cells, and then to ganglion cells. A large degree of processing of visual information occurs in the retina itself, before visual information is sent to the brain.
Photoreceptors in the retina continuously undergo tonic activity. That is, they are always slightly active even when not stimulated by light. In neurons that exhibit tonic activity, the absence of stimuli maintains a firing rate at a baseline while some stimuli increase firing rate from the baseline, and other stimuli decrease firing rate. In the absence of light, the bipolar neurons that connect rods and cones to ganglion cells are continuously and actively inhibited by the rods and cones. Exposure of the retina to light hyperpolarizes the rods and cones and removes their inhibition of bipolar cells. The now active bipolar cells in turn stimulate the ganglion cells, which send action potentials along their axons (which leave the eye as the optic nerve). Thus, the visual system relies on change in retinal activity, rather than the absence or presence of activity, to encode visual signals for the brain. Sometimes horizontal cells carry signals from one rod or cone to other photoreceptors and to several bipolar cells. When a rod or cone stimulates a horizontal cell, the horizontal cell inhibits more distant photoreceptors and bipolar cells, creating lateral inhibition. This inhibition sharpens edges and enhances contrast in the images by making regions receiving light appear lighter and dark surroundings appear darker. Amacrine cells can distribute information from one bipolar cell to many ganglion cells.
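The edge-sharpening effect of lateral inhibition can be sketched as a center-surround computation on a one-dimensional strip of receptors, where each output is the local signal minus a weighted average of its neighbors. The 0.4 inhibition weight is an illustrative choice:

```python
def lateral_inhibition(signal, weight=0.4):
    """Center-surround response: each cell's output is its own input minus
    a weighted average of its two neighbors (edges handled by clamping)."""
    n = len(signal)
    out = []
    for i in range(n):
        left = signal[max(i - 1, 0)]
        right = signal[min(i + 1, n - 1)]
        out.append(signal[i] - weight * (left + right) / 2)
    return out

# A step from dark (1) to light (9): the response overshoots at the border,
# so the dark side of an edge looks darker and the light side looks lighter.
step = [1, 1, 1, 9, 9, 9]
print(lateral_inhibition(step))
```

Running this, the cell just on the dark side of the edge responds below its dark neighbors, and the cell just on the light side responds above its light neighbors, which is the contrast enhancement described above.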
You can use an easy demonstration to “trick” your retina and brain about the colors you are observing in your visual field. Look fixedly at [link] for about 45 seconds. Then quickly shift your gaze to a sheet of blank white paper or a white wall. You should see an afterimage of the Norwegian flag in its correct colors. At this point, close your eyes for a moment, then reopen them, looking again at the white paper or wall; the afterimage of the flag should continue to appear as red, white, and blue. What causes this? According to an explanation called opponent process theory, as you gazed fixedly at the green, black, and yellow flag, your retinal ganglion cells that respond positively to green, black, and yellow increased their firing dramatically. When you shifted your gaze to the neutral white ground, these ganglion cells abruptly decreased their activity, and the brain interpreted this abrupt downshift as if the ganglion cells were now responding to their “opponent” colors: red, white, and blue, respectively, in the visual field. Once the ganglion cells return to their baseline activity state, the false perception of color disappears.
The myelinated axons of ganglion cells make up the optic nerves. Within the nerves, different axons carry different qualities of the visual signal. Some axons constitute the magnocellular (big cell) pathway, which carries information about form, movement, depth, and differences in brightness. Other axons constitute the parvocellular (small cell) pathway, which carries information on color and fine detail. Some visual information projects directly back into the brain, while other information crosses to the opposite side of the brain. This crossing of optical pathways produces the distinctive optic chiasma (Greek for “crossing”) found at the base of the brain and allows us to coordinate information from both eyes.
Once in the brain, visual information is processed in several places, and its routes reflect the complexity and importance of visual information to humans and other animals. One route takes the signals to the thalamus, which serves as the routing station for all incoming sensory impulses except olfaction. In the thalamus, the magnocellular and parvocellular distinctions remain intact, and there are different layers of the thalamus dedicated to each. When visual signals leave the thalamus, they travel to the primary visual cortex at the rear of the brain. From the visual cortex, the visual signals travel in two directions. One stream that projects to the parietal lobe, in the side of the brain, carries magnocellular (“where”) information. A second stream projects to the temporal lobe and carries both magnocellular (“where”) and parvocellular (“what”) information.
Another important visual route is a pathway from the retina to the superior colliculus in the midbrain, where eye movements are coordinated and integrated with auditory information. Finally, there is the pathway from the retina to the suprachiasmatic nucleus (SCN) of the hypothalamus. The SCN is a cluster of cells that is considered to be the body’s internal clock, which controls our circadian (day-long) cycle. The SCN sends information to the pineal gland, which is important in sleep/wake patterns and annual cycles.
Vision is the only photoresponsive sense. Visible light travels in waves and occupies a very small slice of the electromagnetic radiation spectrum. Light waves differ based on their frequency (perceived as hue) and amplitude (perceived as brightness).
In the vertebrate retina, there are two types of light receptors (photoreceptors): cones and rods. Cones, which are the source of color vision, exist in three forms—L, M, and S—and they are differentially sensitive to different wavelengths. Cones are located in the retina, along with the dim-light, achromatic receptors (rods). Cones are found in the fovea, the central region of the retina, whereas rods are found in the peripheral regions of the retina.
Visual signals travel from the eye over the axons of retinal ganglion cells, which make up the optic nerves. Ganglion cells come in several versions. Some ganglion cell axons carry information on form, movement, depth, and brightness, while other axons carry information on color and fine detail. Visual information is sent to the superior colliculi in the midbrain, where coordination of eye movements and integration of auditory information takes place. Visual information is also sent to the suprachiasmatic nucleus (SCN) of the hypothalamus, which plays a role in the circadian cycle.
Why do people over 55 often need reading glasses?
- Their cornea no longer focuses correctly.
- Their lens no longer focuses correctly.
- Their eyeball has elongated with age, causing images to focus in front of their retina.
- Their retina has thinned with age, making vision more difficult.
Why is it easier to see images at night using peripheral, rather than the central, vision?
- Cones are denser in the periphery of the retina.
- Bipolar cells are denser in the periphery of the retina.
- Rods are denser in the periphery of the retina.
- The optic nerve exits at the periphery of the retina.
A person catching a ball must coordinate her head and eyes. What part of the brain is helping to do this?
How could the pineal gland, the brain structure that plays a role in annual cycles, use visual information from the suprachiasmatic nucleus of the hypothalamus?
The pineal gland could use length-of-day information to determine the time of year, for example. Day length is shorter in the winter than it is in the summer. For many animals and plants, photoperiod cues them to reproduce at a certain time of year.
How is the relationship between photoreceptors and bipolar cells different from other sensory receptors and adjacent cells?
The photoreceptors tonically inhibit the bipolar cells, and stimulation of the receptors turns this inhibition off, activating the bipolar cells.
Proprioception is the perception by an animal of stimuli relating to its own position, posture, equilibrium, or internal condition.
The coordination of movements requires continuous awareness of the position of each limb. The receptors in the skeletal (striated) muscles and on the surfaces of tendons of vertebrates provide constant information on the positions of limbs and the action of muscles. Comparable organs of arthropods (e.g., insects, crustaceans) include stretch receptors located on the outsides of muscles and chordotonal organs (special nerves that measure tension changes) within the joints. Awareness of limb position and movements is also gained through the stimulation of sensitive hairs at the joints.
The awareness of equilibrium changes usually involves the perception of gravity. The organ for such perception most frequently found in invertebrates is the statocyst, a fluid-filled chamber lined with sensitive hairs and containing one or more tiny, stonelike grains ( statoliths). The statoliths may be free-moving, as in most mollusks, or loosely fixed to the sense hairs, as in some crustaceans. Statocysts are also found in many cnidarians and worms. Comparable organs in vertebrates are the saccule and utricle of the ear, the grains being called otoliths. In either case, a change in the animal’s position or orientation is conveyed to the sense hairs by the pressure of the statoliths.
A third type of proprioceptor, found in all vertebrates and some invertebrates (e.g., cephalopods, crustaceans), informs the animal of body rotations. The crustacean organ detects changes in the inertia of fluid in a cavity, into which slender sensory hairs project. Rotation of the animal causes the stimulation of the hairs because of the inertial lag of the fluid.
Vertebrates are able to sense rotation by the inertial lag of fluid in the semicircular canals of the ear, acting on sensory hairs. The three canals form loops lying in planes at right angles to each other; by integrating signals from the canals, the central nervous system can detect rotation in planes other than those of the canals.
This article was most recently revised and updated by Kara Rogers, Senior Editor.
Density of Mechanoreceptors
The distribution of touch receptors in human skin is not consistent over the body. In humans, touch receptors are less dense in skin covered with any type of hair, such as the arms, legs, torso, and face. Touch receptors are denser in glabrous skin (the type found on human fingertips and lips, for example), which is typically more sensitive and is thicker than hairy skin (4 to 5 mm versus 2 to 3 mm).
How is receptor density estimated in a human subject? The relative density of pressure receptors in different locations on the body can be demonstrated experimentally using a two-point discrimination test. In this demonstration, two sharp points, such as two thumbtacks, are brought into contact with the subject’s skin (though not hard enough to cause pain or break the skin). The subject reports if he or she feels one point or two points. If the two points are felt as one point, it can be inferred that the two points are both in the receptive field of a single sensory receptor. If two points are felt as two separate points, each is in the receptive field of two separate sensory receptors. The points could then be moved closer and re-tested until the subject reports feeling only one point, and the size of the receptive field of a single receptor could be estimated from that distance.
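The narrowing procedure can be sketched as a small simulation: a simulated subject reports "two points" only when the separation exceeds a hidden receptive-field size, and the tester steps the separation down until the report changes. The millimeter values are illustrative, not clinical norms:

```python
def feels_two_points(separation_mm, receptive_field_mm):
    """Simulated subject: two points are distinguished only when their
    separation exceeds the receptive-field size of a single receptor."""
    return separation_mm > receptive_field_mm

def estimate_receptive_field(receptive_field_mm, start_mm=50.0, step_mm=1.0):
    """Step the separation down until the subject reports a single point,
    then return the smallest separation that was still felt as two points."""
    separation = start_mm
    while separation > 0 and feels_two_points(separation, receptive_field_mm):
        separation -= step_mm
    return separation + step_mm

# Dense receptors (fingertip-like) vs. sparse receptors (back-like):
# the estimated receptive field tracks the hidden value in each case.
fingertip = estimate_receptive_field(3.0)
back = estimate_receptive_field(40.0)
```

The estimate lands one step above the hidden threshold, so a smaller step size gives a finer estimate, which mirrors how the real test trades off precision against the number of trials.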
Short-term memory is the brief period of time during which you can recall information you were just exposed to. Depending on who is using the term, "short-term" can encompass anywhere from 30 seconds to a few days.
Some researchers use the term working memory and distinguish it from short-term memory, though the two overlap. Working memory can be defined as the brain's ability to keep a limited amount of information available long enough to use it. Working memory helps us process thoughts, make plans, and carry out ideas.
You can think of working memory as your short-term memory combining strategies and knowledge from your long-term memory bank to assist in making a decision or calculation.
Working memory has been connected to executive functioning, which is often affected in the earlier stages of Alzheimer's disease.
PLANNING AND TEACHING LAB ACTIVITIES
First, prepare students for lab activities by giving background information according to your teaching practices (e.g., lecture, discussion, handouts, models). Because students have no way of discovering sensory receptors or nerve pathways for themselves, they need some basic anatomical and physiological information. Teachers may choose the degree of detail and the methods of presenting the auditory system based on grade level and time available.
Offer students the chance to create their own experiments
While students do need direction and practice to become good laboratory scientists, they also need to learn how to ask and investigate questions that they generate themselves. Science classrooms that offer only guided activities with a single "right" answer do not help students learn to formulate questions, think critically, and solve problems. Because students are naturally curious, incorporating student investigations into the classroom is a logical step after they have some experience with a system.
The "Try Your Own Experiment" section of this unit (see the accompanying Teacher and Student Guides) offers students an opportunity to direct some of their own learning after a control system has been established in the "Class Experiment." Because students are personally vested in this type of experience, they tend to remember both the science processes and concepts from these laboratories.
Use "Explore Time" before experimenting
To encourage student participation in planning and conducting experiments, first provide Explore Time or Brainstorming Time. Because of their curiosity, students usually "play" with lab materials first even in a more traditional lab, so taking advantage of this natural behavior is usually successful. Explore Time can occur either before the Class Experiment or before the "Try Your Own Experiment" activity, depending on the nature of the concepts under study.
Explore before the Class Experiment
To use Explore Time before the Class Experiment, set the lab supplies out on a bench before giving instructions for the experiment. Ask the students how these materials could be used to investigate the sense of touch in light of the previous lecture and discussion, then offer about 10 minutes for investigating the materials. Give some basic safety precautions, then circulate among students to answer questions and encourage questions. After students gain an interest in the materials and subject, lead the class into the Class Experiment with the Teacher Demonstration and help them to formulate the Lab Question. Wait until this point to hand out the Student Guide, so students have a chance to think creatively. (See the accompanying Guides.)
Explore before "Try Your Own Experiment"
REFERENCES and SUGGESTED READING
- Bellamy, M.L. and Frame, K. (Eds.) (1996). Neuroscience Laboratory and Classroom Activities. National Association of Biology Teachers and the Society for Neuroscience, pp. 113-136.
What is a Sensory Neuron?
A sensory neuron is a nervous system cell that is involved in the transportation of sensory neural impulses from receptors or sensory organs throughout the body. These neural impulses are sent to the brain and translated into an understandable form so that the organism can react to the stimuli. Such understandable forms include sensations of pain, heat, texture, and visual input. The proper reception of such stimuli is crucial to the survival of most organisms, as it keeps them informed of the world around them and allows them to respond accordingly.
A neuron is a cell that is specialized to carry neural information throughout the body; as such, it differs greatly from most cells. Structures known as dendrites sit at one end of the nerve cell; these receive signals from other neurons or sources of sensory information. They are connected to the cell body, which contains the nucleus and other essential organelles that sustain the cell. The axon extends outward from the cell body toward wherever it needs to carry its sensory information; the longest axons in humans can exceed 3.2 feet (1 meter) in length. The axon terminates at the axon terminal, which passes on the neural information to where it is needed.
A sensory neuron generally transmits its information toward the central nervous system, which is contained primarily in the brain and spinal cord. Sensory input, then, is received by the dendrites of the nerve cell and sent through the axon until it either reaches another neuron and passes the signal on, or reaches its destination. Other kinds of cells have limited involvement in this process, making neurons the primary functional part of the nervous system.
There are three primary types of neurons: afferent, efferent, and interneurons. Those that transmit sensory information are afferent neurons, meaning that they take information from sensory organs or tissues and communicate it to the brain. Efferent neurons carry impulses from the central nervous system to other parts of the body and most notably include motor neurons. Interneurons simply connect other neurons, allowing them to reach their destinations in the most effective way possible.
Sensory neurons do not always send their information to the brain, though they typically do in complex organisms, such as humans. In a simple organism lacking a complex central nervous system, they may simply send their information directly to a motor neuron. This allows for a rapid reaction without intensive processing of stimuli.
Sensory integration is the process by which we receive information through our senses, organize this information, and use it to participate in everyday activities.
An example of sensory integration is:
- Baby smelling food as they bring it to their mouth
- Tasting the food
- Feeling the texture of the food
- Determining what this food is and if they want more
Did You Know There Are 7 Senses?
You read that right! Most people think there are just 5 senses, but there are actually 7! So what are the 7 senses?
- Sight (Vision)
- Hearing (Auditory)
- Smell (Olfactory)
- Taste (Gustatory)
- Touch (Tactile)
- Vestibular (Movement): the movement and balance sense, which gives us information about where our head and body are in space. It helps us stay upright when we sit, stand, and walk.
- Proprioception (Body Position): the body awareness sense, which tells us where our body parts are relative to each other. It also gives us information about how much force to use, allowing us to do something like crack an egg while not crushing the egg in our hands.
So how does this all come together? Here's an example of sensory integration while playing baseball:
Imagine you're playing baseball and you're up to bat. You use your vestibular sense to take your batting stance, and proprioception to sense where your hands are, where your feet are, and how you should swing to make contact with the ball. You then see the ball come closer to you and you swing. You hear the ball crack against the bat and you know you've hit it, so you begin to run! You continue to listen and look as you see the other players scrambling to get the ball and tag you out. You can see you're getting closer to first base, but so is the ball, so you decide to slide. As you slide, you balance your body, extend your arms (because you're aware of their position and know they will reach the base first), and feel for when the base is against your fingertips. It may be a little unpleasant to taste and smell the dirt as you slide, but your senses confirm you've made it!
When does sensory development begin?
Very early on! In fact, some sensory development, like sense of smell, begins in utero.
What sensory integration milestones should my child be reaching?
As your child grows and develops, they should achieve new sensory milestones. From visual tracking, to reaching for new toys, to putting objects in their mouth (yes, that's typical!), baby will keep engaging their senses to learn about the world around them.
What to Watch For
Some children have difficulties receiving and processing incoming sensations. Some signs of a sensory issue include:
- Overly sensitive or under reactive to touch, movement, sights, or sounds
- Unusually high or low activity level
- Easily distracted; poor attention to tasks
- Delays in speech, motor skills, or academic achievement
- Coordination problems; appears clumsy or awkward
- Poor body awareness
- Difficulty learning new tasks or figuring out how to play with unfamiliar toys
- Difficulty with tasks that require using both hands at the same time
- Appears to be disorganized most of the time
- Difficulty with transitions between activities or environments
- Immature social skills
- Impulsivity or lack of self-control
- Difficulty calming self once "wound up"
What to do if you suspect a delay
Each child reacts to sensory information differently. Sensory issues are very complex because a child's sensory system can be a mixture of over-reactive, under-reactive, and typically responsive.
If you suspect an issue, contact a healthcare provider to share your concerns. Everyday tasks can become difficult for a child who processes sensory information differently, so it's best to connect with a professional who can help you understand your child's sensory integration.