In research focused on 6-to-9-month-old babies, University of Pennsylvania psychologists Elika Bergelson and Daniel Swingley demonstrated that the infants learned the meanings of words for foods and body parts through their daily experience with language.
Bergelson is a doctoral student and Swingley an associate professor in Penn’s Department of Psychology. Their study was published this week in the Proceedings of the National Academy of Sciences Early Edition.
These findings unseat a previously held consensus about infant learning. It was widely believed that infants between 6 and 9 months, while able to perceive and understand elements of the sounds of their native language, did not yet possess the ability to grasp the meanings conveyed through speech. Most psychologists believed word comprehension didn’t emerge until closer to a child’s first birthday.
In fact, infants are often referred to as “pre-linguistic,” according to Bergelson. But there have been few attempts to determine just when infants begin understanding what is meant by specific words. The belief that infants do not comprehend language for most of the first year is easy to understand, given that infants do not often speak in words, or even gesture meaningfully, before 10 or 11 months.
To test this belief, Bergelson and Swingley recruited caregivers to bring their children to a lab to complete two different kinds of test. In the first, a child sat on the caregiver’s lap facing a screen on which there were images of one food item and one body part.
The caregiver wore headphones and heard a statement such as, “Look at the apple,” or, “Where’s the apple?” and then repeated it to the child. The caregiver also wore a visor to avoid seeing the screen. An eye-tracking device, which can distinguish precisely where a child is looking and when, then followed the child’s gaze.
The second kind of test had the same set-up, except that, instead of the screen displaying a food item and a body part, it displayed objects in natural contexts, such as a few foods laid out on a table, or a human figure. For both kinds of test, the question was whether hearing a word for something on the screen would lead children to look at that object more, indicating that they understood the word.
In total, Bergelson and Swingley tested 33 infants between 6 and 9 months old. The researchers also had 50 children aged 10 to 20 months complete the same tests to see how their abilities compared with those of the younger group.
As part of their analysis, Bergelson and Swingley corrected for eye movements not related to caregivers’ speech. Bergelson pointed out that to infants some objects are more interesting than others, whatever their parents might say.
“So if you have a boring cup and a really colorful cup, they’re going to look at the more interesting thing, all else being equal.”
To eliminate this potential source of error, the researchers subtracted the amount of time that the babies gazed at a given object when it was not being named from the time they looked when it was named.
“The idea there is that they have some sort of baseline for how much they like to look at the thing, so when you take that away, what’s left is their word recognition,” Bergelson said.
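The baseline-subtraction idea described above can be sketched in a few lines of code. This is only an illustrative reconstruction, not the researchers' actual analysis pipeline; the function name and the looking times are invented for the example.

```python
# Hypothetical sketch of the baseline-correction step described above.
# All looking times (in seconds) are invented for illustration only.

def baseline_corrected_looking(named_time, baseline_time):
    """Subtract how long an infant looks at an object when it is NOT
    being named from how long they look when it IS named; a positive
    remainder suggests word recognition rather than mere visual interest."""
    return named_time - baseline_time

# An intrinsically interesting object (say, a colorful cup) draws long
# looks even without naming, so raw looking time alone is misleading.
named = 2.4    # seconds of gaze while the object was named
unnamed = 1.5  # seconds of gaze at the same object, unnamed
print(round(baseline_corrected_looking(named, unnamed), 2))
```

The point of the subtraction is that each object serves as its own control: whatever makes it attractive to look at in general is removed, leaving only the extra attention triggered by hearing its name.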
In both the two-picture and scene tests, the researchers found that the 6- to 9-month-old babies fixed their gaze more on the picture that was named than on the other image or images, indicating that they understood that the word was associated with the appropriate object.
This is the first demonstration that children of this age can understand such words.
“There had been a few demonstrations of understanding before, involving words like mommy and daddy,” Swingley said. “Our study is different in looking at more generic words, words that refer to categories.”
“We’re testing things that look different every time you see them,” Bergelson said. “There’s some variety in apples and noses, and ‘nose’ doesn’t just mean your nose; it could mean anybody’s nose. This is one of the things that makes word learning complicated: words often refer to categories, not just individuals.”
Bergelson and Swingley were also curious to know whether they could observe a pattern of learning during the months from 6 to 9. But, when they compared the performance of 6- and 7-month-old babies with that of 8- and 9-month-olds, they found no improvement.
“That is a surprising result. We don’t know why it is that performance remains flat for so long,” Swingley said.
Factoring in the results of the older babies, the researchers found little improvement until the children reached roughly 14 months, at which point word recognition jumped markedly.
“Maybe what is going on with the 14-month-olds is they understand the nature of the task as a kind of game and they’re playing it,” Swingley said. “Or the dramatic increase in performance at 14 months may be due to aspects of language development we did not measure specifically, including better categorization of the speech signal, or better understanding of syntax.”
He noted that it is also possible that children do improve between 6 and 14 months, but that this improvement is masked by the fact that older babies in this range may be more distractible and less attentive.
The study’s novel results contribute to an ongoing debate about infant language acquisition and cognitive development.
“I think it’s surprising in the sense that the kids at this age aren’t saying anything, they’re not pointing, they’re not walking,” Bergelson said. “But actually, under the surface, they’re trying to put together the things in the world with the words that go with them.”
“I think this study presents a great message to parents: You can talk to your babies and they’re going to understand a bit of what you’re saying,” Swingley said. “They’re not going to give us back witty repartee, but they understand some of it. And the more they know, the more they can build on what they know.”
The research was supported by the National Science Foundation and the National Institutes of Health.