The Language You Speak Influences Where Your Attention Goes
Psycholinguistics is a field at the intersection of psychology and linguistics, and one of its recent discoveries is that the languages we speak influence our eye movements. For example, English speakers who hear candle often also look at a candy, because the two words share their first syllable. Research with speakers of different languages revealed that bilinguals not only look at words that share sounds within one language but also at words that share sounds across their two languages. When Russian-English bilinguals hear the English word marker, they also look at a stamp, because the Russian word for stamp is marka.
Even more stunning, speakers of different languages show different patterns of eye movements even when no language is used at all. In a simple visual search task in which people had to find a previously seen object among other objects, their eyes moved differently depending on which languages they knew. For example, when looking for a clock, English speakers also looked at a cloud. Spanish speakers, on the other hand, when looking for the same clock, looked at a present, because the Spanish names for clock and present—reloj and regalo—overlap at their onset.
The story doesn’t end there. Not only do the words we hear activate other, similar-sounding words—and not only do we look at objects whose names share sounds or letters even when no language is heard—but the translations of those names in other languages become activated as well in speakers of more than one language. For example, when Spanish-English bilinguals hear the word duck in English, they also look at a shovel, because the translations of duck and shovel—pato and pala, respectively—overlap in Spanish.
Because of the way our brain organizes and processes linguistic and nonlinguistic information, a single word can set off a domino effect that cascades throughout the cognitive system. And this interactivity and co-activation are not limited to spoken languages. Bilinguals of spoken and signed languages show co-activation as well. For example, bilinguals who know American Sign Language and English look at cheese when they hear the English word paper, because cheese and paper share three of the four sign components in ASL (hand shape, location, and orientation, but not movement).
What do findings like these tell us? Not only is the language system thoroughly interactive, with a high degree of co-activation across words and concepts, but it also shapes our processing in other domains such as vision, attention and cognitive control. As we go about our everyday lives, how our eyes move, what we look at and what we pay attention to are influenced in direct and measurable ways by the languages we speak.
The implications of these findings for applied settings range from consumer behavior (what we look at in a store) to the military (visual search in complex scenes) and art (what our eyes are drawn to). In other words, it is safe to say that the language you speak influences how you see the world not only figuratively but also quite literally, down to the mechanics of your eye movements.