How is Reality Constructed in the Brain?

How the brain allows us to see an animal, know what it is, where it is in relation to us, and how it compares to other things in our world is a phenomenon that puzzles many people.

The neuroscience underlying this phenomenon suggests that we – or rather our brains – construct reality for us. Neuroscientists often refer to this constructed reality as a ‘hallucination’. This hallucination is then kept accurate by our senses – mainly our sight and hearing.


The Perceptual Process

The perceptual process begins with a distal stimulus – any physical object in the environment, for example, an apple. Our sensory receptors then receive information about the apple via a form of environmental energy (e.g., light, sound waves, or chemicals), creating a representation of the distal stimulus called the proximal stimulus.

Our sensory receptors then transform this environmental energy into electrical energy in the nervous system. For example, receptors in the retina transform the light reflected off the object into electrical impulses. These electrical signals are then transmitted from one neuron to the next and processed.

From this, a conscious sensory experience – perception – occurs. A split second after perception, recognition occurs in which we place the object we perceived in a category – e.g., we perceived an apple, which is a fruit.

How we process incoming sensory information is referred to by neuroscientists as ‘bottom-up processing’. Additionally, our existing knowledge, assumptions, and memories can influence perception and recognition. This is referred to by neuroscientists as ‘top-down processing’. Perception involves both bottom-up and top-down processing.
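One common way to make the interplay between bottom-up and top-down processing concrete is a toy Bayesian calculation, in which prior expectations (top-down) are weighted against sensory evidence (bottom-up). The Bayesian framing, and all of the category names and numbers below, are illustrative assumptions rather than anything stated in the article.

```python
# Toy sketch (not from the article): combining top-down expectations with
# bottom-up sensory evidence using Bayes' rule.

# Top-down: prior expectations about which object we are likely to encounter.
prior = {"apple": 0.6, "tomato": 0.3, "cricket ball": 0.1}

# Bottom-up: how well the incoming signal ("small, round, red, shiny")
# matches each candidate object (made-up likelihood values).
likelihood = {"apple": 0.5, "tomato": 0.7, "cricket ball": 0.4}

# Combine the two sources of information and normalize.
unnormalized = {obj: prior[obj] * likelihood[obj] for obj in prior}
total = sum(unnormalized.values())
posterior = {obj: p / total for obj, p in unnormalized.items()}

for obj, p in sorted(posterior.items(), key=lambda kv: -kv[1]):
    print(f"{obj}: {p:.2f}")
# The percept the brain "settles on" corresponds to the most probable
# interpretation given both expectation and evidence. Here the strong prior
# for "apple" wins even though the raw evidence slightly favors "tomato".
```

In this toy example, the prior pulls the interpretation toward the expected object even when the raw sensory evidence is ambiguous, which is one way of picturing how top-down knowledge shapes what we perceive.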

How is our Visual World Constructed in the Brain?

Light enters the eye and is detected by photoreceptors in the retina. These photoreceptors transform the light reflected off an object into electrical impulses, which are transmitted via the optic nerve to the lateral geniculate nucleus (LGN) in the thalamus. The LGN then sends signals to the primary visual cortex in the occipital lobe.

From the primary visual cortex, information about the object – such as what it is, its location, and its color – is transmitted to the higher visual cortices of the brain.

The identification and recognition of the object occur via the ‘what pathway’: signals travel from the primary visual cortex to the temporal lobe. The object’s location in space is perceived via the ‘where pathway’: signals travel from the primary visual cortex to the parietal lobe.

Color is related to the wavelength of light that hits our eyes. Different objects absorb and reflect different wavelengths – objects themselves do not have color. How we perceive color is also determined by how bright and saturated the reflected light is.

The human retina has three types of cones – photoreceptors specialized for color vision – commonly called red, green, and blue. Each cone type responds best to a different range of wavelengths and encodes the light it absorbs as an electrical signal. These signals are sent to the primary visual cortex and then to area V4, where much of the brain’s color processing takes place.

So, when we see something as red, ‘red’ is simply the label that our brain has attached to that particular pattern of cone signals.
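As a rough illustration of how a single wavelength of light becomes three cone signals, the sketch below models each cone type’s sensitivity as a Gaussian curve. The peak wavelengths and curve width are simplified, assumed values for illustration, not measured human data.

```python
import math

# Toy sketch: approximate each cone type's sensitivity with a Gaussian curve.
# Peak wavelengths (nm) and the curve width are rough, illustrative values.
CONE_PEAKS = {"L (red)": 560.0, "M (green)": 530.0, "S (blue)": 420.0}
WIDTH = 40.0  # assumed spread of each sensitivity curve, in nm


def cone_responses(wavelength_nm: float) -> dict:
    """Relative response of each cone type to a single wavelength of light."""
    return {
        cone: math.exp(-((wavelength_nm - peak) ** 2) / (2 * WIDTH ** 2))
        for cone, peak in CONE_PEAKS.items()
    }


# Light at ~650 nm drives the L cones far more strongly than the M or S cones;
# that pattern of signals is what the brain ultimately labels "red".
print(cone_responses(650))
```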

For some individuals, for example, those with colorblindness, color is experienced very differently from ‘normal’.

One of the most common forms of colorblindness is deuteranopia (a form of red-green colorblindness), which occurs due to the absence of the green (M) cones. Individuals with deuteranopia have difficulty distinguishing between red and green but can still differentiate between lighter and darker shades of each.
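A very crude way to get a feel for what losing the green cone signal does to the red-green distinction is to collapse the red and green channels of an RGB color together, as in the toy sketch below. Real colorblindness simulations use proper cone-space transforms; this simplified version is an assumption made purely for intuition.

```python
# Toy sketch only: collapse the red and green channels of an RGB color to
# mimic the lost red-green distinction in deuteranopia. Proper simulations
# work in cone (LMS) space; this crude version is just for intuition.

def collapse_red_green(rgb: tuple) -> tuple:
    r, g, b = rgb
    rg = (r + g) // 2          # red and green become indistinguishable
    return (rg, rg, b)         # light/dark and blue differences survive

red = (200, 30, 30)
green = (30, 200, 30)
print(collapse_red_green(red))    # (115, 115, 30)
print(collapse_red_green(green))  # (115, 115, 30) - same result as for red
```

Note that a bright red and a dark red still come out different after this transform, echoing the point that light/dark differences remain distinguishable even when red and green themselves do not.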

How do we Make Sense of What We Hear?

Sound arises from changes in air pressure. Sensory receptors in the inner ear transform these pressure waves into electrical impulses, which are transmitted from neuron to neuron to the cochlear nucleus in the medulla. From the cochlear nucleus, auditory information travels to the medial geniculate nucleus in the thalamus and then to the primary auditory cortex in the temporal lobe.

How we perceive sounds depends on the pitch and loudness of the sound and the location from which the sound is coming. The pitch of a sound depends on the frequency of the sound waves, and the loudness depends on the amplitude of the sound waves.

How loud we perceive a sound to be is also affected by its pitch – a low-pitched sound needs a higher amplitude to be perceived as equally loud as a higher-pitched sound.
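The physical quantities behind pitch and loudness can be written down directly: a pure tone is a sine wave whose frequency sets the pitch and whose amplitude sets the physical intensity. The sketch below generates two such tones; the specific frequencies and amplitudes are arbitrary example values, and the perceived loudness (as opposed to physical intensity) is not something this simple calculation captures.

```python
import math

def pure_tone(frequency_hz: float, amplitude: float, duration_s: float = 0.01,
              sample_rate: int = 44100) -> list:
    """Samples of a pure tone: y(t) = A * sin(2*pi*f*t)."""
    n_samples = int(duration_s * sample_rate)
    return [amplitude * math.sin(2 * math.pi * frequency_hz * t / sample_rate)
            for t in range(n_samples)]

def rms(samples: list) -> float:
    """Root-mean-square level, a simple proxy for physical intensity."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# Frequency sets the pitch: 110 Hz is a low note, 880 Hz a much higher one.
# Amplitude sets the physical intensity; here the low note carries more
# energy, which (per the point above) it may need in order to *sound* as loud.
low_note = pure_tone(frequency_hz=110, amplitude=0.8)
high_note = pure_tone(frequency_hz=880, amplitude=0.2)

print(f"low-note RMS:  {rms(low_note):.3f}")
print(f"high-note RMS: {rms(high_note):.3f}")
```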

Information from the primary auditory cortex is then sent to the higher auditory cortices, where the sound is recognized and its location in space is determined.

Synesthesia and Agnosia

Some individuals’ brains construct reality in a very different way from others’. From tasting numbers to failing to recognize something as common as an apple, reality is not the same for everyone.

Synesthesia is a condition in which stimulation of one sensory or cognitive pathway consistently triggers an additional perceptual experience in another. Examples of synesthesia include tasting shapes and hearing colors. Experts suspect that approximately 1 in 300 people are synesthetes.

Agnosia is the inability to recognize objects, people, or sounds. Visual form agnosia is the inability to recognize objects by their visual form; research suggests that it occurs because of damage to the ‘what’ pathway in the temporal lobe. Individuals with visual form agnosia may be able to describe an object’s individual features yet still fail to recognize or name it.

In a Nutshell, Our Brain Models the World for Us

Perception is the brain’s search for the best interpretation of the stimuli presented to us. What we believe we see and hear of the world is essentially a model built by the brain, and this model is kept accurate by our sight and hearing.

For some individuals, the model their brain creates differs greatly from the ‘norm’. Individuals with synesthesia arguably perceive ‘too much’, whereas individuals with agnosia or colorblindness perceive less.

Through something as simple as looking around, our brain – almost like magic – uses our past experiences and connects them with our current situation to construct our reality as we know it.


References:

  • Armstrong, R. A., & Cubbidge, R. C. (2019). The eye and vision: An overview. In V. R. Preedy & R. R. Watson (Eds.). Academic Press. https://www.sciencedirect.com/science/article/pii/B9780128152454000016
  • Carpenter, S. (2001). Everyday fantasia: The world of synesthesia. Monitor on Psychology. https://www.apa.org/monitor/mar01/synesthesia
  • Milner, A. D., Perrett, D. I., Johnston, R. S., Benson, P. J., Jordan, T. R., Heeley, D. W., Bettucci, D., Mortara, F., Mutani, R., Terazzi, E., & Davidson, D. L. W. (1991). Perception and action in “visual form agnosia”. Brain, 114(1), 405–428. https://doi.org/10.1093/brain/114.1.405
  • Shamma, S. A., & Micheyl, C. (2010). Behind the scenes of auditory perception. Current Opinion in Neurobiology, 20(3), 361–366. https://doi.org/10.1016/j.conb.2010.03.009
  • Simner, J. (2011). Defining synaesthesia. British Journal of Psychology, 103(1), 1–15. https://doi.org/10.1348/000712610X528305

Last Updated: Apr 14, 2022

Written by

Joelle Hanson-Baiden

Joelle completed her Bachelor of Science degree in Cognitive Neuroscience and Psychology at The University of Manchester in 2021. Prior to this, Joelle completed a Biosciences Foundation Year at The University of Manchester in 2018.
