Brain’s Anatomy – What Part Of The Brain Controls Speech?

Human conversation relies mostly on the faculty of speech, supplemented by the production of certain sounds, each of which is distinct in meaning. The production and reception of these sounds require a properly functioning ear and auditory system, as well as intact and healthy vocal and sound-generating structures, including the larynx, the tongue, and the lips. But what part of the brain controls speech?

In general, the left hemisphere of the brain is in charge of language and speech; because of this, it has been called the dominant hemisphere. The right hemisphere plays a major part in interpreting visual information and in spatial processing.

In about one-third of left-handed individuals, speech function may be located on the right side of the brain. Left-handed people may need specialized tests before any surgical treatment to determine whether their speech center is on the left or the right. Recent studies[1] have shown that in about 97% of people, language is represented in the left hemisphere.

However, in approximately 19% of left-handed people, the areas responsible for language are in the right hemisphere, and as many as 68% of them have some language capacities in both the left and the right hemispheres. In recent decades, there has been an explosion of research into language processing in the brain. It is now generally accepted that the control of speech is part of a complex network in the brain.

What part of the brain controls speech?

Speech production is a complex process involving a networked system of brain areas, each of which contributes in unique ways. Areas beyond Broca’s area[2] and the anterior insula have been implicated in the production of speech movements. The ability to produce accurate speech sounds in rapid succession is something humans take for granted, yet speaking is a remarkably involved process.

Speech and other language abilities are lateralized brain functions, meaning they are largely located on one side of the brain. For most people, the left cerebral hemisphere[3] controls language. According to Shippensburg University[4], which hemisphere is dominant for language varies with the person’s dominant hand: 97 percent of right-handed people have left-hemisphere language areas, while 19 percent of left-handed people have right-hemisphere language areas, and a further 68 percent of left-handed people have language areas in both hemispheres. If an injury occurs to one of these language areas, the patient can have difficulty producing speech.

Speech is the verbal means of communicating. Speech consists of the following:

Articulation: How speech sounds are made

Voice: The use of the vocal folds and breathing to produce sound (e.g., hoarseness, breathiness, projection)

Fluency and prosody: The rhythm, intonation, stress, and related attributes of speech

There are several areas of the brain that play a critical role in speech and language:

Cerebrum: Each hemisphere of the cerebrum may be divided into regions called lobes[5], which include the frontal, parietal, temporal, and occipital lobes.

The lobes located at the front and side of the brain, the frontal lobes[2] and the temporal lobes[2], are primarily involved in forming and understanding speech.

Cerebellum: The cerebellum is found at the rear of the brain. It is involved in coordinating skeletal muscle movements, including opening and closing your mouth, moving your arms and legs, standing upright, and maintaining balance. It also contributes to language processing[6].

Broca’s area: Located in the left hemisphere[7], Broca’s area is associated with speech production and articulation: articulating ideas and using words accurately in spoken and written communication.

[Image: Broca’s area]

Broca’s area also passes information to another part of the brain called the motor cortex, which controls the movements of your mouth. It is named after the French physician Pierre Paul Broca[8], who identified the region in 1861.

Wernicke’s area: This critical language zone in the posterior superior temporal lobe connects to Broca’s area via a neural pathway. Wernicke’s area is primarily involved in comprehension. Historically, this area has been associated with language processing, whether written or spoken.

Arcuate fasciculus: The arcuate fasciculus is a band of nerve fibers that connects Wernicke’s area and Broca’s area. It helps you form words, speak clearly, and understand concepts in language form.

Angular gyrus: The angular gyrus allows us to associate multiple kinds of language-related information, whether auditory, visual, or sensory. It is located in close proximity to other critical brain regions: the parietal lobe, which processes touch sensation; the occipital lobe, which is involved in visual analysis; and the temporal lobe, which processes sounds. The angular gyrus allows us to associate a perceived word with different images, sensations, and ideas.

Motor cortex: To speak clearly, you need to move the muscles of your mouth, tongue, and throat. This is where the motor cortex comes into play.

Located in the frontal lobe, the motor cortex takes information from Broca’s area and tells the muscles of your face, mouth, tongue, lips, and throat how to move to produce speech.

Aphasia: Classification of Disorders

Aphasia is a communication disorder: a disturbance in the comprehension or expression of language.

Aphasia results from damage to parts of the brain that are important for language. For most people, these areas are on the left side of the brain, where the two regions chiefly responsible for language, Wernicke’s area and Broca’s area, are found. Aphasia is usually categorized as “expressive” or “receptive”, depending on whether it is harder to express or to understand language. But most people with aphasia have some trouble with speaking and may have a mixture of problems with writing, reading, and perhaps listening.

Different types of aphasia

Global aphasia

This is the most severe form of aphasia. The term is applied to patients who can produce few recognizable words and understand little or no spoken language.

When an injury encompasses both Wernicke’s and Broca’s areas, global aphasia can occur. In this case, all components of speech and language are affected. Patients can say a few phrases at most and understand only a few words and phrases. They commonly cannot carry out commands or name objects, and they cannot read, write, or repeat words[9] said to them.

Broca’s aphasia (non-fluent aphasia)

Broca’s aphasia results from injury to speech and language brain areas such as the left inferior frontal gyrus[10], among others.

Damage to a discrete part of the left frontal lobe (Broca’s area) of the language-dominant hemisphere has been shown to significantly affect spontaneous speech and motor speech control. Words may be uttered very slowly and poorly articulated.

Wernicke’s aphasia (fluent aphasia)

This type of aphasia usually involves profound language comprehension deficits, even for single words or simple sentences. This is because in Wernicke’s aphasia individuals have damage to brain areas that are important for processing the meaning of words and speech. Such damage includes the left posterior temporal regions of the brain, which are part of what is known as Wernicke’s area, hence the name of the aphasia.

[Image: Wernicke’s area]

Reading and writing are often severely impaired. As in other types of aphasia, individuals may retain fully intact intellectual and cognitive capabilities unrelated to language and speech.

Anomic aphasia

Anomic aphasia is one of the milder types of aphasia. The term is applied to persons who are left with a persistent inability to produce the words for the very things they want to talk about, particularly significant nouns and verbs. Their speech is fluent and grammatically correct, but it is filled with vague words (such as ‘thing’) and circumlocutions (attempts to describe the word they are searching for). The sensation is often that of having the word on the tip of one’s tongue, which results in frequent expressions of frustration in their speech.

Communication Strategies: Some Dos and Don’ts

  1. Make sure you have the person’s attention before you begin.
  2. Minimize or eliminate background noise (TV, radio, other people).
  3. Keep your voice at a normal level, unless the person has indicated otherwise.
  4. Keep communication simple, but adult. Simplify your own sentence structure and reduce your rate of speech. Emphasize keywords. Don’t “talk down” to the person with aphasia.
  5. Give them time to talk. Resist the urge to complete sentences or offer words.
  6. Communicate with gestures, writing, drawings, and facial expressions.
  7. Confirm that you are communicating successfully with “yes” and “no” questions.
  8. Praise all attempts to talk and downplay any errors. Avoid insisting that every word be produced perfectly.
  9. Engage in normal activities whenever possible. Don’t shield people with aphasia from family or ignore them during a group conversation. Rather, attempt to involve them in family decision-making as much as possible. Keep them informed of events but avoid burdening them with day-to-day details.
  10. Encourage independence and avoid being overprotective.

Speech Recognition From Brain Activity 

Speech is produced in the human cerebral cortex. Brain waves associated with speech processes can be recorded directly with electrodes placed on the surface of the cortex. It has now been shown for the first time that it is possible to reconstruct basic units, words, and complete sentences of continuous speech from these brain waves and to generate the corresponding text. Researchers at KIT and the Wadsworth Center, USA, present their “Brain-to-Text” system in the scientific journal Frontiers in Neuroscience (doi: 10.3389/fnins.2015.00217).

“It has long been speculated whether humans may communicate with machines via brain activity alone,” says Tanja Schultz, who conducted the present study with her team at the Cognitive Systems Lab of KIT. “As a major step in this direction, our recent results indicate that both single units, in terms of speech sounds, and continuously spoken sentences can be recognized from brain activity.”

These results were obtained through an interdisciplinary collaboration of researchers in informatics, neuroscience, and medicine. In Karlsruhe, the methods for signal processing and automatic speech recognition were developed and applied. “In addition to the decoding of speech from brain activity, our models allow for a detailed analysis of the brain areas involved in speech processes and their interaction,” explain Christian Herff and Dominic Heger, who developed the Brain-to-Text system during their doctoral studies.
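To make the signal-processing step more concrete, below is a minimal sketch of one common way raw ECoG recordings are turned into features for speech decoding: band-pass filtering each channel into the high-gamma range and averaging the power over short windows. This is only an illustration of a typical ECoG pipeline, not the authors’ code; the filter band, window length, sampling rate, and function names are assumptions.

```python
# Minimal sketch: turning raw ECoG into windowed high-gamma power features.
# Illustrative only; the filter band, window length, and sampling rate are
# assumptions, not the settings used in the Brain-to-Text study.
import numpy as np
from scipy.signal import butter, filtfilt

def high_gamma_features(ecog, fs=1000, band=(70.0, 170.0), win_s=0.05):
    """ecog: array of shape (n_samples, n_channels) -> (n_windows, n_channels)."""
    # Band-pass filter each channel into the assumed high-gamma range.
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ecog, axis=0)

    # Average the signal power over non-overlapping windows.
    power = filtered ** 2
    win = int(win_s * fs)
    n_windows = power.shape[0] // win
    return power[: n_windows * win].reshape(n_windows, win, -1).mean(axis=1)

# Example with fake data: 10 s of 64-channel ECoG sampled at 1 kHz.
fake_ecog = np.random.randn(10_000, 64)
print(high_gamma_features(fake_ecog).shape)  # (200, 64) with 50 ms windows
```

Feature windows like these, aligned with the spoken audio, are the kind of input a statistical decoding model can then be trained on.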

The present work is the first to decode continuously spoken speech and transform it into a textual representation. For this purpose, cortical information is combined with linguistic knowledge and machine learning algorithms to extract the most likely word sequence. Currently, Brain-to-Text is based on audible speech, but the results are an important first step toward recognizing speech from thought alone.
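As a rough illustration of how neural evidence and linguistic knowledge can be combined, the toy decoder below scores candidate word sequences by adding a hypothetical per-word neural log-likelihood to a small bigram language-model score and keeps the best-scoring sequence. All names, scores, and the tiny vocabulary are invented for the example; the real Brain-to-Text system uses far more sophisticated automatic-speech-recognition machinery over sub-word units.

```python
# Toy example: combine per-word "neural" scores with a bigram language model
# to find the most likely word sequence. All numbers below are made up.
import itertools
import math

# Hypothetical log-likelihoods of each candidate word given the brain activity
# for three consecutive word slots (higher = better match to the neural features).
neural_scores = [
    {"the": -1.0, "a": -1.2},
    {"cat": -0.8, "can": -1.5},
    {"sat": -1.1, "sad": -1.0},
]

# A tiny bigram language model: log-probability of a word given the previous word.
bigram_logp = {
    ("<s>", "the"): -0.5, ("<s>", "a"): -1.0,
    ("the", "cat"): -0.7, ("the", "can"): -2.0,
    ("a", "cat"): -1.0,   ("a", "can"): -1.5,
    ("cat", "sat"): -0.6, ("cat", "sad"): -2.5,
    ("can", "sat"): -2.0, ("can", "sad"): -2.0,
}

def sequence_score(words, lm_weight=1.0):
    """Total score = neural evidence + weighted language-model evidence."""
    score, prev = 0.0, "<s>"
    for slot, word in enumerate(words):
        score += neural_scores[slot][word]
        score += lm_weight * bigram_logp.get((prev, word), math.log(1e-6))
        prev = word
    return score

# Exhaustive search is fine for this toy vocabulary; real decoders use dynamic
# programming (e.g. Viterbi search) over a large vocabulary instead.
candidates = itertools.product(*[slot.keys() for slot in neural_scores])
best = max(candidates, key=sequence_score)
print(" ".join(best))  # -> "the cat sat"
```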

Brain activity was recorded in the USA from seven epileptic patients, who participated voluntarily in the study during their clinical treatment. An electrode array was placed on the surface of the cerebral cortex (electrocorticography, ECoG) as part of their neurological treatment. While the patients read sample texts aloud, the ECoG signals were recorded with high resolution in time and space. Later on, the researchers in Karlsruhe analyzed the data to develop Brain-to-Text. In addition to advancing basic science and the understanding of the highly complex speech processes in the brain, Brain-to-Text might become a building block for developing a means of speech communication for locked-in patients in the future.


  1. Ronny Plontke. 2013 [March 13, 2013]. Language and Brain. Available from: https://www.tu-chemnitz.de/phil/english/sections/linguist/independent/kursmaterialien/termpapers_online/plontke_lang&brain.pdf
  2. The National Institute of Neurological Disorders and Stroke. 2020. Brain Basics: Know Your Brain. Available from: https://www.ninds.nih.gov/Disorders/Patient-Caregiver-Education/Know-Your-Brain#Image%201
  3. R Joseph. 1988. The right cerebral hemisphere: emotion, music, visual-spatial skills, body-image, dreams, and awareness. Available from: https://pubmed.ncbi.nlm.nih.gov/2461390/
  4. Journal of Sport & Exercise Psychology. 2012. Keynote Speakers, 34(Suppl.), S1-S4. Available from: https://manualzz.com/doc/17618596/http—journals.humankinetics.com-
  5. Kendra Cherry. 2019. A Guide to the Anatomy of the Brain. Available from: https://www.verywellmind.com/the-anatomy-of-the-brain-2794895
  6. Clifford L. Highnam & Ken M. Bleile. 2011 [November 1, 2011]. Language in the Cerebellum. Available from: https://pubs.asha.org/doi/10.1044/1058-0360(2011/10-0096)
  7. Michael C. Corballis. 2014 [January 21, 2014]. Left Brain, Right Brain: Facts and Fantasies. Available from: https://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.1001767
  8. Aninda B. Acharya; Michael Wroten. 2020. Broca Aphasia. Available from: https://www.ncbi.nlm.nih.gov/books/NBK436010/
  9. David Kemmerer. 2015. Cognitive Neuroscience of Language. Available from: https://bit.ly/2HgqHQw
  10. Casper A M M van Oers, Matthijs Vink, et al. 2010 [January 1, 2010]. Contribution of the left and right inferior frontal gyrus in recovery from aphasia. A functional MRI study in stroke patients with preserved hemodynamic responsiveness. Available from: https://pubmed.ncbi.nlm.nih.gov/19733673/